Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- Well people barely work well together. The world needs to learn how to treat oth… (ytc_UgyigZsSr…)
- consider the inverse could be a problem as well, imagine the AI thinking a gun i… (ytc_Ugyo1DUN6…)
- AI is not creative it just gathers information and uses it so whatever they use … (ytc_Ugzpwix4_…)
- A simulation theory. AI developed to the point no one can understand it. All job… (ytc_UgyI645RY…)
- @MTMC1111 only Three states in america have laws in regards to deepfakes. Nowher… (ytr_UgwW3Kn9k…)
- There is probably also a huge conflict of incentive in how AI is fundamentally d… (ytc_UgzzVgANR…)
- "yo u can replace someone with 20$ subscription" the point he is not talking abo… (ytc_UgwIZ0hFP…)
- As an engineer ai can’t be original so when company’s get the same 3 building de… (ytc_UgwVfE-B9…)
Comment (youtube · AI Moral Status · 2025-04-29T19:1…)

> 1984 movie - Terminator- ( which is never referenced) in AI projects revealed a potential future scenario as outlined in its script. One line in particular was the scene where Kyle Reese is explaining to Sarah Connor the future to come that was controlled by AI Skynet control dominance. “ …and then they got smart and saw all humans as a threat!”
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
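A coded record like the one above can be sanity-checked against the value sets that actually appear in this dump. Note this is a hedged sketch: the sets below are only the values observed in the samples shown here, not the project's full codebook, and `check_coding` is a hypothetical helper name.

```python
# Values observed in this dump only -- the real codebook may allow more.
OBSERVED = {
    "responsibility": {"ai_itself", "none", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"unclear", "none", "ban", "regulate", "liability"},
    "emotion": {"fear", "mixed", "indifference", "outrage", "resignation"},
}

def check_coding(record: dict) -> list[str]:
    """Return the dimensions whose value falls outside the observed sets."""
    return [dim for dim, allowed in OBSERVED.items()
            if record.get(dim) not in allowed]

# The record from the Coding Result table above:
coded = {"responsibility": "ai_itself", "reasoning": "consequentialist",
         "policy": "unclear", "emotion": "fear"}
print(check_coding(coded))  # -> [] (every value was seen in this dump)
```

A record missing a dimension, or using an unseen value, would show up in the returned list.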
Raw LLM Response
[{"id":"ytc_Ugx_-6KtoSQgC2HZi314AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzBPBng8plC2oIR6-14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz4MzsMHj9vCsT9Ye14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzHkeKPT4_tChCecIF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugza1M470QAUCMwgGJl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx9uUdltvoSwbHSA7Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz56TDAFpZf9tazgcR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwrC932-ziXt1euMrZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugxr1ubC1_He6AoAnQJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugx7YCJGd4adonl_blR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"resignation"}]