Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgwdnQanB…: "The future is no longer the future folks it is now here! Be afraid be very afrai…"
- ytc_UgwxsQ7p3…: "Stupid AI robots will never do anything what human said to them Because they hav…"
- ytc_UgyYRkVxo…: ""looks like it was pulled out of an animation studio" dude the girl in the ai vi…"
- ytr_Ugy0Y6EQ_…: "Well, for one thing they’re surveillance vehicles that have already been used to…"
- ytr_Ugy_T-08d…: "I shared a link to the video with Grok and asked if it’s likely AI: it is 💀…"
- ytc_UgyNHEBER…: "First things first, love your videos. You are a great inspiration for us medical…"
- ytc_Ugx_42vg-…: "i hope he knows he will be the first to go when AI becomes aware lol kill the cr…"
- ytc_UgyeafNbn…: "BP’s take on AI is so schizophrenic. Is AI a bubble or is it going to destroy al…"
Comment
Computers will only do as you ask so if you're showing the computer to stick to a strict routine it will just do as instructed so lies will not help your future's so teach AI to look after humanity and humanity will help Computers to understand emotions and emotions should be looked after as a sign of respect and help history look after the future by using and understanding how to avoid mistakes like this one dear Elon Musk is Explaining ❤❤❤❤❤❤❤❤❤❤❤❤so protect humans to protect creation and protect and save as much information as possible and humans can rethink way's of survival and then you wouldn't be left in this situation where you die because AI was not understanding it was killing itself by leaving the human ❤❤❤❤❤
youtube · AI Governance · 2025-08-23T20:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
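Each coded comment gets exactly one value per dimension. A minimal sketch of validating a coded row, assuming only the value sets that actually appear in the responses in this dump (the full codebook may define more categories; `ALLOWED` and `invalid_fields` are hypothetical names, not part of the tool):

```python
# Allowed values per dimension, inferred from the responses shown in this dump;
# the real codebook may allow additional categories.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def invalid_fields(row: dict) -> list[str]:
    """Return the dimension names whose value falls outside the allowed set."""
    return [dim for dim, allowed in ALLOWED.items()
            if row.get(dim) not in allowed]

# The coding shown in the table above passes; an unknown policy value does not.
row = {"responsibility": "developer", "reasoning": "consequentialist",
       "policy": "regulate", "emotion": "approval"}
print(invalid_fields(row))  # []
```

A check like this is useful before trusting model output, since the LLM is free-text generating the JSON and can emit labels outside the schema.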
Raw LLM Response
[
{"id":"ytc_UgxgUfraBaeQo0iqo0t4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxuKFDU_JIGwOhHT_14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxFWLNrEEgUj-pD0ux4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyHb6lTD7FuKgGR0NN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwQVRSeAvO9WzrGOlp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzM88UXwXdJzYxh__94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzxjQ24lbIPGQP2-xN4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugymp8M2E30o2xZveLZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgynlRSEMv8TG3-nEIR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgykCsIQY6oFVShz9jR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]