Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "AI is just a beautiful tool created to help humans. Humans are the greedy pigs w…" (ytc_UgyuLXEx_…)
- "@newwaveinfantry8362 what about intent, i dont really agree with anything you ar…" (ytr_Ugzqu1OTh…)
- "Well, if we ever create like a super robot, that can think on his own etc... peo…" (ytc_UgyKYdbdZ…)
- "All software will by default have a bias, even if those writing the program have…" (ytc_UgwmeV475…)
- "Ai must be regulated, all training data must be paid for, and the core AI must h…" (ytc_Ugzz2jBzY…)
- "It'll survive, look how the artist from Ghibli, you can tell his anger and frust…" (ytc_UgwsJX8sQ…)
- "I'm sure the robot had human emotions and actually attacked but anything for a s…" (ytc_UgwWBHVFf…)
- "Most of software engineer is repeating the same patterns of development, which A…" (ytr_UgzQ1WE9P…)
Comment
Could AI be the AntiChrist? Something that could potentially reach deity levels of intelligence and start demanding that we lowly humans worship it? It's kinda plausible when you think about it. Prophecy says that the AntiChrist will solve the worlds problems and create prosperity, causing him to be loved by the world before claiming to be God and demanding worship. Something AI could possibly do.
youtube · AI Moral Status · 2025-06-01T11:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugzh55EsBgQsYN7xPIZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxw24pcqfthScXxRyp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugxq-3Mj0p1w0a2ocnZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwUvB3U2YdMb75ihSp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy8gn5Ny8T6d3sdnyh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzhPMmgdrUQTyofMHp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwn1i9Mxum4FEfdcZV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzp6trz9EL5MO5cEFN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwWGcyPg67i_iOnxLZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyuQfBi89yJI_USdzJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"}
]
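The raw response above is a JSON array of per-comment codings that can be indexed by comment ID to drive the lookup and the "Coding Result" table. The sketch below is a minimal, hypothetical illustration of that parsing step (the `index_codings` helper, the shortened IDs, and the `DIMENSIONS` tuple are assumptions, not part of the tool itself):

```python
import json

# Assumed coding dimensions, matching the columns of the table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw_response: str) -> dict:
    """Parse a raw batch-coding response and index each record by comment ID."""
    records = json.loads(raw_response)
    coded = {}
    for rec in records:
        # Keep only the expected dimensions; absent keys fall back to "unclear".
        coded[rec["id"]] = {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
    return coded

# Two records shaped like the response above (IDs shortened here for
# illustration; real IDs are longer).
raw = '''[
  {"id":"ytc_AAA","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_BBB","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]'''

coded = index_codings(raw)
print(coded["ytc_AAA"]["emotion"])  # fear
```

Indexing by ID also makes it easy to detect comments the model skipped or coded twice, by comparing the keys of `coded` against the batch of submitted IDs.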