Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- I use ai, but only for concepts i give to paid freelancers, or other things that… (ytc_UgxILhwvC…)
- In my opinion, humanity will destroy itself - "using old effective methods". And… (ytr_Ugz71Xn76…)
- We're already being moulded by AI, the ability to resist this takeover will simp… (ytc_UgxoEtG1m…)
- I have a feeling this guy belives there is a much highter chance that AI will ki… (ytc_Ugw6w84DW…)
- It's like in "Electric Dreams" movie. AI will learn and become smarter than us. … (ytc_UgzlN9vq7…)
- Velvet Sundown seemed to make an impact. Looks like AI has made an impact and i… (ytc_UgwjyAppk…)
- Im on my second month with all the AI's trying to create some simple apps, trust… (ytc_Ugx7gG48t…)
- You are so preposterously idiotic, that you can't even begin to comprehend the d… (ytr_Ugx3KFK0H…)
Comment
In theory, everything sounds great. I've worked with automated systems that control pressure systems and other parameters, and I can tell you that they must be under human supervision for many reasons. They are not very adaptable to the environment like humans are; sometimes these systems tend to be dangerous because contradictions arise in their functional whole. They don't predict collateral damage. Machines with intelligence only know that their parts are replaced. Like an autonomous car—if it crashes, it will request to be replaced.
youtube · Cross-Cultural · 2025-11-27T16:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
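The four coded dimensions above can be sanity-checked against a fixed vocabulary. The value sets below are only those observed in the samples on this page (the full codebook may define more categories), and the `validate` helper is a hypothetical sketch, not part of the tool:

```python
# Allowed values inferred from the sample codings on this page -- an
# assumption, not the authoritative codebook.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "user", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "resignation", "outrage", "indifference", "approval", "mixed"},
}

def validate(row: dict) -> list:
    """Return the dimensions whose values fall outside the allowed sets."""
    return [dim for dim, allowed in ALLOWED.items()
            if row.get(dim) not in allowed]

# The coding result shown in the table above passes cleanly:
row = {"responsibility": "developer", "reasoning": "consequentialist",
       "policy": "regulate", "emotion": "fear"}
print(validate(row))  # -> []
```

Flagging out-of-vocabulary values this way catches the most common LLM coding failure: a plausible-sounding label (e.g. `"ban"` for policy) that was never in the schema.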
Raw LLM Response
```json
[
  {"id":"ytc_UgwjCcstAOv-jeCVOAt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyoY9NUi7X_m6lpKjd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyGrjX6Dwfjv6Nna5J4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwHviOE0e5SIZqPxhB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz74omPj-8N1KzlEBt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz1j7TS_huiKVX24fl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzLKD50b8LbBtHvX_x4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyDZW6vNQvZZntbZAx4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzdceQhNq2dZTC36qN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwRj1BMix3c2UkcU2d4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
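The "look up by comment ID" step amounts to parsing this batch response and selecting one row. A minimal sketch, assuming only the JSON shape shown above (a list of objects keyed by `id`); the `lookup_by_id` helper name is hypothetical:

```python
import json
from typing import Optional

# Two rows copied from the raw LLM response above, for illustration.
RAW_RESPONSE = """
[
  {"id": "ytc_UgwjCcstAOv-jeCVOAt4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyoY9NUi7X_m6lpKjd4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
"""

def lookup_by_id(raw: str, comment_id: str) -> Optional[dict]:
    """Parse the raw batch output and return the coding row for one comment."""
    rows = json.loads(raw)
    return next((row for row in rows if row["id"] == comment_id), None)

row = lookup_by_id(RAW_RESPONSE, "ytc_UgwjCcstAOv-jeCVOAt4AaABAg")
print(row["policy"])  # -> regulate
```

Returning `None` for an unknown ID (rather than raising) makes it easy to detect comments the model silently dropped from a batch.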