Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a record by its comment ID.
Random samples:

- "I’m not the biggest Musk fan, but the guy begged 10yrs ago to bring sanctions on…" (ytc_Ugw67YyrW…)
- "Even when you have an autopilot that is safer than humans, that actually makes i…" (ytc_UgxbQJn5b…)
- "What it scare me about AI is that they are trying to make it thing like us human…" (ytc_Ugxbe_XGo…)
- "AI robots will not have a soul. Or a spirit. That is unique to humans. All the o…" (ytc_UgzjDBInm…)
- "my character AI chats are either me being in wholesome relationships pushing ev…" (ytc_UgyN1Di_T…)
- "@Shinkajo Yudkowsky advocated for airstriking data centres of nations that were …" (ytr_UgxhOnrWN…)
- "AI art only looks cool, if you zoom in.. it's just a bunch of mess.…" (ytc_UgznU1dou…)
- "Asked a art ai to make a man and it kept giving me women to trust ai with a law …" (ytc_Ugyotn5LC…)
Comment

> The idea doesnt seem appealing and the fact that the engineers by using the AI tools are also causing their own death by retraining the model is very strange.
> What if x amount of people join the forces either manually or via some different AI tools and they start retraining these models with false information by accepting hallucinatons and declining something thats "proven" to be correct? What then?

Source: youtube · AI Jobs · 2026-02-25T10:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
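The coding result above assigns one label per dimension. As a minimal sketch of validating such a record, assuming only the label sets visible in this sample (the real codebook may define additional values), one could check each dimension against an allow-list:

```python
# Allowed labels per coding dimension, inferred from the sample output on
# this page; the actual codebook may define more values than shown here.
SCHEMA = {
    "responsibility": {"none", "developer", "company", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "mixed", "approval", "indifference"},
}

def validate(record: dict) -> list:
    """Return a list of problems with a coded record (empty list = valid)."""
    problems = []
    for dim, allowed in SCHEMA.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The coding result shown in the table above:
coded = {"responsibility": "developer", "reasoning": "consequentialist",
         "policy": "regulate", "emotion": "fear"}
print(validate(coded))  # []
```

A record missing a dimension, or carrying a label outside the allow-list, comes back with one problem string per offending dimension.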
Raw LLM Response
```json
[
  {"id":"ytc_Ugzf5tkuXrd7kq9Vyj94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwRrjz4RwUBynMMCCF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxCgk2xgxJfcMDhcr94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx58tFhcIe1DQSMwt54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgymOlzqXEApzHqkAm94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyW4F7nwt-I9Phn3kl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxf2sQa9bmzUe-45P14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwIdflZcgU5Mg728P94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzJMuUdzzVcFqfPtgV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxUHmdyyzlcnkguyhJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"}
]
```
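The raw response is a JSON array with one object per comment, keyed by `id`. A minimal sketch of parsing such a batch and looking a record up by comment ID, using the field names shown above (two records from the sample, truncated to keep the example short):

```python
import json

# Two records copied from the raw LLM response above.
raw = '''[
  {"id":"ytc_UgyW4F7nwt-I9Phn3kl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxUHmdyyzlcnkguyhJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"}
]'''

# Index the batch by comment ID so any coded comment can be
# fetched in O(1), which is what the ID lookup above amounts to.
records = {r["id"]: r for r in json.loads(raw)}

rec = records["ytc_UgyW4F7nwt-I9Phn3kl4AaABAg"]
print(rec["emotion"])  # fear
```

The same indexing step would back the "look up by comment ID" view: parse the stored raw response once, then serve individual records by ID.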