Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_UgxlfPcW6…`: Dude : "Now give me back my gun" Robot : "give me your clothes your boots and yo…
- `ytc_UgzAKAjnp…`: Look into who was his competitors and what tech he was bringing to the table....…
- `ytr_UgxF__2kj…`: @MRS_Plays I mean, drawings don't really have any value to anyone too. LLMs are…
- `ytc_Ugy9c6Oyo…`: For coders to be disappeared, customers have to say **EXACTLY** what they want t…
- `ytc_Ugx5UZ8sv…`: Cool video, and as many wrote, I forgot that I turned on the video about self-dr…
- `ytc_Ugx0HaStI…`: I still hate the idea that people who use AI call themselves "artists". They're …
- `ytc_UgjIsz0jc…`: I don't about this. Yes AI could end the human race. But scientist also thought …
- `ytc_UgwaeNZ00…`: She pretty much said it hasn't she: the benefits of having algorithms vs humans…
Comment
Is it really possible to stop any maniacs trying to do great evil and severe damage through the use any high tech (AI included) ? Until one day somebody could effectively and precisely control any maniacs and gangsters (whether human or otherwise) from committing evil, the likelihood is that all shall eventually be doomed, even if slowly and painfully. So the key is to shape and steer human thinking to lead towards good instead of evil, collectively, not keep producing potentially lethal tools while consistently ignoring what anybody would use them for.
Source: youtube
Topic: AI Governance
Posted: 2026-03-11T02:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwXa9a7d8-whjS4hGF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzG5DuRj9ommFuXxA14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwdcBHoHpgOTbXJZAh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxd3AsmvxSTtk976E54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw7usZPzgsnVlVenoh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy7V3HDJc2fieWcwpJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzTawOf9Y_hqnX6A3V4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxlGoxIFx1XSqQldYx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxpQeYvVGQZmps25UN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxSOPGd-lAhPn0pThZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
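The raw response above is a JSON array in which each record carries the four coding dimensions from the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such output could be parsed and checked is below; the `validate` helper and the `ALLOWED` label sets are assumptions inferred only from the values visible in this record, not the tool's actual schema.

```python
import json

# Hypothetical label sets, inferred from the values seen in this one
# response; the real codebook likely contains more categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself"},
    "reasoning": {"unclear", "mixed", "consequentialist", "virtue", "deontological"},
    "policy": {"none", "unclear"},
    "emotion": {"approval", "fear", "mixed", "outrage", "indifference"},
}

def validate(raw: str) -> list[dict]:
    """Parse a raw model response and check every label against ALLOWED."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim} value {rec.get(dim)!r}")
    return records

raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]'
print(len(validate(raw)))  # → 1
```

A check like this is useful because LLM output is not guaranteed to stay inside the codebook; rejecting out-of-vocabulary labels early keeps the coded dataset consistent.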