Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

I’ve always heard that we’re (the public) 50 years behind on technology that our government/Military is already using. If we’re 5-10 years away from AI taking over are we truly already there? Maybe it’s not AI that will back itself up but some mad scientist dictator type person. This scares me more than nuclear war. At least with that there’s a chance that some human with common sense could stop it.

- Source: youtube
- Category: AI Governance
- Posted: 2023-07-07T02:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyZfFJJCPiXRjmMxgl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugya96xncARCDGs5nDd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyiaXFFlGp5iseW-pN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzeWz3vOtBBlHXqtYd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyR_phxuA-QQdGET2J4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxEv_30Qtz-BNiQFpp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx59qUk0UpFx0n1Gyd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyuecJIyJvXW9QPEKZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugz_sh-qNkH_yRXolWZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx-AERNMFRP_lJ9D1l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
```
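The model codes a whole batch of comments in one JSON array, so recovering a single comment's coding means parsing the array and indexing by `id`. Below is a minimal sketch of that lookup, assuming only the field names visible in the response above; `index_codings` is a hypothetical helper, not part of the pipeline, and the two rows are copied from the response for illustration.

```python
import json

# Raw LLM response: a JSON array of per-comment codings
# (two rows copied from the batch above for illustration).
raw_response = """[
  {"id":"ytc_UgyR_phxuA-QQdGET2J4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx59qUk0UpFx0n1Gyd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

def index_codings(raw: str) -> dict:
    """Parse the model output and index each coding row by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(raw_response)
coding = codings["ytc_UgyR_phxuA-QQdGET2J4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # government fear
```

Indexing by ID is what lets the dashboard show the coded dimensions for the comment above next to the exact batch response it came from.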