Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by comment ID, or browse the random samples below.
- Remember I Robot when Will Smith was in his car and he took it out of autonomous… (ytc_Ugwc0gOfT…)
- Google Engineer: Hey I think this AI is sentient. Google CEO: Shhhhh..... We are… (ytc_UgyU6MV7b…)
- If generating AI art makes him an artist, then calling 911 makes me a police off… (ytc_UgxRC9Sbs…)
- If you hit the robot with a taser it probably short circuit the computer on boar… (ytc_UgwSOqYfn…)
- The thing with A.I is that it's a snowball effect. Once a milestone is reached, … (ytc_Ugxn4fRnv…)
- I'm at counselor, we use ai for a lot of paperwork thankfully, but the actual se… (ytc_UgxfKUXw0…)
- 🇺🇸When will Congress make law creating super intelligent USA ai president & ai l… (ytc_UgzVv-6Fw…)
- I once asked AI, which will it see as a bigger threat, Humans OR Another AI - it… (ytc_UgxmdGh0v…)
Comment

> Its because he knows. He built much of it and tried to control it. The hisory of man should have been looked at closer first and listened to. Who built AI?

youtube · AI Governance · 2024-05-29T12:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgxuoSStT8KF8CM70Ll4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugy5mCfOWcfbVmt16714AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxqHbfh8JOcJ2jVoX54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw5aoRfYyHGakpJEn54AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxlltMWvBsDsRwfpeF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwoliC-BUbTiPtQ7kF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyRqlF34rrwesNXqOR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy8Oh4u66imyZsF2Qp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxOGvA6J9YvTMlw8UZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxP3A68SwmjrG6MrnF4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"ban","emotion":"mixed"}
]
```
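The raw response is a JSON array of per-comment coding records, so a lookup by comment ID (as this page offers) reduces to parsing the array and indexing it. A minimal Python sketch, assuming only the field names visible in the JSON above (the `raw` string here reproduces just the first two records for brevity; the indexing approach is illustrative, not the tool's actual implementation):

```python
import json

# Raw LLM batch response: one coding record per comment ID,
# as shown under "Raw LLM Response" (first two records only).
raw = '''[
{"id":"ytc_UgxuoSStT8KF8CM70Ll4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugy5mCfOWcfbVmt16714AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]'''

# Index the records by comment ID for O(1) lookup.
records = {rec["id"]: rec for rec in json.loads(raw)}

# Look up the coding for a single comment by its ID; this record
# matches the "Coding Result" table above (developer / mixed / unclear / mixed).
coding = records["ytc_Ugy5mCfOWcfbVmt16714AaABAg"]
print(coding["responsibility"])  # developer
```

Keying on `id` also makes it easy to detect records the model dropped or duplicated when coding a batch, by comparing the dict's keys against the list of IDs that were sent.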