Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- „an autonomous erasure of the one thing that separates us from the rest of the a… (`ytr_UgwwLRJXa…`)
- I'm eyerolling so hard right now. It's nothing to do with that. It's just trying… (`ytr_UgyRj1exs…`)
- UBI is the solution. Any citizen over 18 gets $1K a week every Friday deposit to… (`ytc_UgwYcYM0r…`)
- AI art is doomed anyway, because as everything is increasingly AI generated, the… (`ytc_UgzdhIB_j…`)
- I stopped when I learned about the effects of it on environment. I really don’t … (`ytc_UgwYesA1U…`)
- so its a great code and many others coming since LLMs is just the beginning... A… (`ytc_UgymfcPcx…`)
- It really depends on the language pair you are interacting with . GER<> EN… (`rdc_kt5tfgo`)
- The amount of AI adoptables that COST MONEY to have etsy is littered with them… (`ytc_UgxHRa2lU…`)
Comment

> This form of breathless doom laden "the sky's falling... eeks!!" isn't very mature in a discussion of the possibilities and dangers of A I moving too fast. I think we should have predicted some negative vectors in the Internet back in the 1990s. We are likely also to act too slowly in regulating the tech of A I. Super smart "independent of human input" AI isn't besieging us yet. There's a lead time to spare if we use it sanely and calmly to ensure AI won't engage in inhumane or negative behaviour as in sci fi novels. Only if real human control with moral intentions is present to continually observe robots, machines that think.

youtube · AI Governance · 2025-11-28T13:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgyB4eHmJcwIHfj54V14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwmo3OuVAMyxO1N_r14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz2EsxpoXvG55B9YlR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwsWYEKIpyyVfqp0fd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgywrwCsV7brjr-CYYV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugy34CXJNNg7V8FzM2B4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz7O0B8NSW93_ZQaP54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgyCK6P3uQw_4r_pDfZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx1_Noh3obDQJmXSfl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxXCVuAtwKTFVbIS5V4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
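A raw response like the one above is a JSON array of records, each keyed by a comment ID and carrying one label per coding dimension. The following is a minimal sketch of how such a batch could be parsed and validated; the allowed category values are inferred only from the labels visible on this page (the actual codebook may define more), and `parse_raw_response` is a hypothetical helper, not part of any real pipeline.

```python
import json

# Allowed values per coding dimension, inferred from labels seen in this
# dashboard; the real codebook may include additional categories.
SCHEMA = {
    "responsibility": {"government", "company", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "approval", "unclear"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM batch response and index the records by comment ID,
    rejecting any record whose labels fall outside the known categories."""
    coded = {}
    for rec in json.loads(raw):
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in SCHEMA}
    return coded

# One record from the response shown above, matching the Coding Result table.
raw = ('[{"id":"ytc_UgywrwCsV7brjr-CYYV4AaABAg","responsibility":"government",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}]')
coded = parse_raw_response(raw)
print(coded["ytc_UgywrwCsV7brjr-CYYV4AaABAg"]["policy"])  # regulate
```

Indexing by ID is what makes the "look up by comment ID" view above possible: a single dictionary lookup retrieves the coded dimensions for any comment.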