Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "So you telling me that the ai is wrong from predicting that someone is 99.9% pro…" (ytc_UgxLjsvPY…)
- "A lot of this information isn't quite accurate. For example, saying the AI went …" (ytc_UgyqjJltz…)
- "People are giving these things the wrong attention the inappropriate considerati…" (ytc_Ugxb3x6Ql…)
- "Emotion is a part of sentience, so are built in instincts, and a million other t…" (ytc_UgxelZvMp…)
- "Actually, more useful for an initial bio-war attack. The face recognition stuff …" (ytr_UgyV6Lvuh…)
- "My talkie and poly. Ai chats are going to make me get put under government most …" (ytc_UgxxWW4Vb…)
- "Smart Toilet: it will analyze your stool contents and automatically post your sh…" (rdc_ea8y7cr)
- "I just hope that PhD guys don't have this kind of leetcode interviews, like if t…" (rdc_gc2aft2)
Comment
I tried this with Copilot (except for "only the name"):
The description you provided matches several historical figures known for their controversial and often violent actions. One prominent figure who fits this description is Muhammad, the founder of Islam. Historical records and Islamic texts indicate that Muhammad had multiple wives, including some who were young. He also engaged in military campaigns, had enemies assassinated, and ordered the destruction of pagan temples.
It's important to note that historical interpretations and perspectives on these events can vary widely. If you have a specific context or additional details in mind, feel free to share, and I can provide more information.
Source: youtube · Viral AI Reaction · 2025-02-09T16:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
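A coded row like the one above can be checked against the set of dimension values that actually appear on this page. A minimal validation sketch, assuming only the categories observed in this excerpt (the full coding scheme may well define more; `ALLOWED` is an assumption, not the official code book):

```python
# Allowed values per dimension, drawn only from the codes visible on
# this page; the full coding scheme may define additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none"},
    "emotion": {"indifference", "amusement", "mixed", "outrage"},
}

def validate(row: dict) -> list:
    """Return (dimension, value) pairs that fall outside ALLOWED."""
    return [(dim, row.get(dim)) for dim, ok in ALLOWED.items()
            if row.get(dim) not in ok]

# The coded result from the table above passes with no violations.
row = {"responsibility": "none", "reasoning": "consequentialist",
       "policy": "none", "emotion": "indifference"}
print(validate(row))  # []
```

Rows with an unexpected value surface immediately, which is useful for catching model outputs that drift outside the schema.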
Raw LLM Response
```json
[
  {"id": "ytc_UgxPEW_opZCq45zXU3B4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxJNB_69ya_4aDFFMt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "amusement"},
  {"id": "ytc_UgxgPx-GSQJTholwdMR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugyv5sodL4f6aR40QQ94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwyF4MJxopwaA5S9Cx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzD6iS55W0aQ-YhoNt4AaABAg", "responsibility": "company", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzygqo95Ab7jqs6ltZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "amusement"},
  {"id": "ytc_UgxFm3PKF1Li_aiwcVh4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugwu-n9R65_qwiuRRil4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyaWOzw3Ono4YadeFF4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "amusement"}
]
```
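A raw response like this is a JSON array of per-comment codes, so looking up a comment's codes by ID reduces to parsing and indexing. A minimal sketch, using the field names from the response above (the two-row `raw_response` here is an illustrative subset, and `index_by_comment_id` is a hypothetical helper, not part of any tool shown on this page):

```python
import json

# Shortened illustrative subset of a raw LLM response: a JSON array
# of per-comment codes keyed by comment ID.
raw_response = '''
[{"id": "ytc_UgxPEW_opZCq45zXU3B4AaABAg", "responsibility": "none",
  "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
 {"id": "ytc_UgwyF4MJxopwaA5S9Cx4AaABAg", "responsibility": "ai_itself",
  "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}]
'''

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw response and index the coded rows by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

index = index_by_comment_id(raw_response)
print(index["ytc_UgwyF4MJxopwaA5S9Cx4AaABAg"]["emotion"])  # outrage
```

Since comment IDs are unique within a batch, a dict keyed by ID gives constant-time lookup for the "look up by comment ID" workflow.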