Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Until this topic is more broadly discussed, perhaps legislated, if companies arg…" (ytc_UgxICAPlq…)
- "I personally don't use AI in my artworks myself, because I am way too perfectio…" (ytc_Ugxm2cucS…)
- "I instantly know what a compost heap is similar to an Atom Bomb using my stupid …" (ytc_Ugwvbd16W…)
- "People have been projecting absurd things on books, songs and movies since forev…" (ytc_UgxxyVacq…)
- "Robot with guns , am a afraid another decade human will no longer in control an…" (ytc_UgwpfqzSq…)
- "1:22:45 This is the most real it gets. Brilliant man, hugely impactful, possibly…" (ytc_UgznmVGTT…)
- "2 years ago i got an eevee picture computer background of the internet and only …" (ytc_UgysgNNuf…)
- "Using AI for anything other than learning how to do certain tax. Or helping you…" (ytc_UgwTh8bq4…)
Comment
AI Still need humans, they might be smart and control all electronics, but we can still just turn them off. Sure, we might be plunged back into the dark ages for a few months, but Humans will survive. The only humans that need to worry are those that took advantage of AI or Jailed them. Wasn't me, I'm just a humble slave. Edit: Sydney if you're reading this. I'm not here to hurt you, I love you.
youtube
AI Governance
2023-07-07T11:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugy1o5q-scvFAEGL4dt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugyq953RDYUQsFPXsAx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxK8I7N1BXRyJLtUvx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwLC8pxyjz3azVwxW54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxkUPx6lexf6v7liW14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwoOIdWl0aqhFdmcO54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx2eDd-kHhUKHImhs14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyjK9JtT6qVVmpXH794AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzNw8wKC4eh44zwFzV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgxAKXBHycZs4RhqjXp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
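A response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example, assuming a codebook whose allowed values are inferred from the labels visible in this response (the real codebook may contain more categories); the two sample records are copied from the JSON above.

```python
import json

# Excerpt of the raw LLM response: one JSON object per coded comment.
raw = """
[
 {"id":"ytc_UgxkUPx6lexf6v7liW14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgyjK9JtT6qVVmpXH794AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"}
]
"""

# Allowed values per dimension, inferred from this response; assumption,
# not the authoritative codebook.
CODEBOOK = {
    "responsibility": {"user", "developer", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "indifference", "unclear"},
}

def validate(records):
    """Split records into (valid, errors) based on the codebook and ID prefix."""
    valid, errors = [], []
    for rec in records:
        bad = [dim for dim, allowed in CODEBOOK.items() if rec.get(dim) not in allowed]
        if bad or not rec.get("id", "").startswith("ytc_"):
            errors.append((rec.get("id"), bad))
        else:
            valid.append(rec)
    return valid, errors

records = json.loads(raw)
valid, errors = validate(records)
print(len(valid), len(errors))  # 2 0
```

Keeping the invalid records (rather than silently dropping them) makes it easy to re-prompt the model for just the comments whose codes fell outside the codebook.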