Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- “Is sucks that she died but she should have used the cross walk or at least do th…” (ytc_UgzeG5l_J…)
- “@catchmezooming then you can start an ai wrapper startup that can raise 7 figure…” (ytr_UgzWQ8By8…)
- “I read the FAA is expanding their facial recognition usage after they found it t…” (rdc_jv5z0hd)
- “Totally panicking. The terminator movies warned us. Was more worried how it woul…” (ytc_UgxaifqeU…)
- “You won't need to attack the artificial intelligence, only the people who dictat…” (ytc_UgxOvlAcj…)
- “Your analogy is like saying the Dictionary is racist because it has racist words…” (ytc_UgzChfszO…)
- “Technology companies are making billions of dollars, but when they run into trou…” (ytc_UgyxPf5N5…)
- “Some of y’all are stupid. A deepfake is no worse than an actually photo. Think f…” (ytc_UgweDzSnH…)
Comment

> Let's get rid of the arms industry, and while we're at it, let's also get rid of pharmaceutical companies unless their products have no side effects. We should also stop in our tracks regarding AI: if the threat it poses is greater than the benefits it offers, we should consider not using it. However, as is often the case, the average person has no say in this; it's the military-industrial-big tech complex that decides whether these technologies should be implemented.

| Field | Value |
|---|---|
| Source | youtube |
| Topic | AI Governance |
| Posted | 2025-06-17T11:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugz52Uu5d7jdZooGen94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwNiI46-lZ_xn1zthF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzAFia9aLqXWDKGZ-Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxD3IU8IumHHh6Q1r54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzel7LjO1le8JN-J414AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxGxKSFfYUmqZQ4rIp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzlsLRRX7faQVr0svh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxjyG5mRbMiTbV2cTF4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_Ugw0JgkbVkBZknwAaUV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugzh0brdg4DNh490M054AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
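The raw response is a JSON array covering a whole batch of comments; the per-comment Coding Result shown above is simply the array element whose `id` matches the inspected comment. A minimal sketch of that lookup, using one record copied from the response above (the function name `lookup` is illustrative, not part of the tool):

```python
import json

# One record excerpted verbatim from the batch response above.
raw = '''[
  {"id":"ytc_Ugzel7LjO1le8JN-J414AaABAg","responsibility":"distributed",
   "reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]'''

def lookup(raw_response: str, comment_id: str) -> dict:
    """Return the coding record for one comment id from a batch response."""
    records = json.loads(raw_response)
    by_id = {r["id"]: r for r in records}
    return by_id[comment_id]

coding = lookup(raw, "ytc_Ugzel7LjO1le8JN-J414AaABAg")
print(coding["policy"], coding["emotion"])  # ban fear
```

In practice a real implementation would also have to handle malformed model output (non-JSON text, missing keys, unknown ids), which the sketch above omits.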