Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "THIS. So much stupid lately and YouTube KNOWS it. Regular yt, Premium yt, and yt…" (ytc_UgydjzLlI…)
- "Problem is we dont have super intelligence in our ai , we dont even have intelli…" (ytc_Ugz4RiBRM…)
- "Shut the program now and shut off the AI. These people are playing with fire. We…" (ytc_Ugxa9WwVa…)
- "Imagine having a noble price winner and giving such a lame surface level intervi…" (ytc_UgxASlaUf…)
- "Do artists ever stop to ask who is born believing they should grow food for arti…" (ytc_UgyXmrpV4…)
- "AI can't replace litigation attorneys specifically because AI lacks logical reas…" (ytc_UgypNmA1R…)
- "Biology is awesome. Reminds me of the people trying to grow meat in a lab ending…" (rdc_lp7xuol)
- "+Chad Cansler But we 'think' we're alive too. We could be simulations ourselves,…" (ytr_UgiCjS6CN…)
Comment
The fastest way to know if a religion is false or not, is if the reward of following that religion benefits the spirit or the desires of the flesh.
Having dominion over any child of God, ( woman, man, friend, or enemy) is a desire of the flesh.
If your religion wants to destroy your brother or sister in God, instead of show them love and compassion, because they don't agree in your teachings, then you are following the desires of the flesh, and false prophet
Source: youtube · Video: Viral AI Reaction · Posted: 2025-02-02T18:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_UgxPEW_opZCq45zXU3B4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxJNB_69ya_4aDFFMt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "amusement"},
  {"id": "ytc_UgxgPx-GSQJTholwdMR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugyv5sodL4f6aR40QQ94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwyF4MJxopwaA5S9Cx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzD6iS55W0aQ-YhoNt4AaABAg", "responsibility": "company", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzygqo95Ab7jqs6ltZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "amusement"},
  {"id": "ytc_UgxFm3PKF1Li_aiwcVh4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugwu-n9R65_qwiuRRil4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyaWOzw3Ono4YadeFF4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "amusement"}
]
```
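As a rough sketch, a response in this shape (a JSON array of per-comment codes, with the field names shown above) can be parsed and indexed by comment ID for lookup. The records below are copied from the example response; the helper function name is illustrative, not part of the tool:

```python
import json

# Two records in the shape of the raw LLM response shown above.
raw_response = """[
  {"id": "ytc_UgxPEW_opZCq45zXU3B4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwyF4MJxopwaA5S9Cx4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse the model output and key each coded record by its comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codes = index_by_comment_id(raw_response)
print(codes["ytc_UgwyF4MJxopwaA5S9Cx4AaABAg"]["emotion"])  # → outrage
```

Keying on `id` is what makes the "look up by comment ID" view possible: each coded dimension for a given comment is one dictionary access away.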