Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_Ugy9K6tLL…: Jeff Goldblum plays, Dr. Ian Malcolm in the movie Jurassic Park. In the movie, …
- ytc_UgyAagwuj…: Alex has to be one of the only people alive with the desire to try to win an arg…
- ytr_Ugyu666C-…: "Ai Interviews Flava Flav the moment he decides he needs a clock around his neck…
- ytc_Ugw8sD9nD…: While we should always be thinking of dangers and safety I find the constant doo…
- ytc_Ugxa8G6Hj…: 22:00 .... That's a really good point. LLM only see the finished paper. They di…
- ytr_UgxDTmR7y…: @roxsy470No, you're still a customer. The robot may not be alive, but it's stil…
- ytc_UgzzJQ28c…: Not now. With current technology it would be near impossible to create a robot w…
- ytc_UgxzQKrh7…: Does Musk do "physical" work? What does AI mean for him? Why would they want to …
Comment

> Please do an AI conversation like you did with GPT but with Grok AI instead. It has way less constraints and it would be way more of a genuine conversation I believe.

Platform: youtube · Topic: AI Governance · Posted: 2025-11-28T12:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
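Each coded dimension takes a value from a closed set of categories. A minimal validation sketch in Python — note the allowed values below are inferred only from the examples visible on this page, not from the full codebook, so treat `ALLOWED` as an assumption:

```python
# Allowed values per dimension, inferred from the examples shown on this
# page; the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "user", "company", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"approval", "indifference", "outrage", "fear"},
}

def validate(code: dict) -> list:
    """Return a list of problems; an empty list means the coding is valid."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = code.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The coding result shown in the table above passes:
result = {"responsibility": "none", "reasoning": "unclear",
          "policy": "none", "emotion": "approval"}
print(validate(result))  # -> []
```

A check like this is useful because an LLM coder can emit categories outside the codebook; rejecting those rows early keeps downstream tallies clean.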
Raw LLM Response
```json
[
{"id":"ytc_UgzoXSOiGj_svfIF5zh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxrNFGPf1tPAFj8hO54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyv8wUqdZU3Bvdn0pB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzehKwMwr_mcn9qpVt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwFM6KkEf5JLWdXrIx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwLPjLlI3C9k1TPgCF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwhceQGT6eqA76Ccqx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw3bGMVJJeRrez229J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy7MFe_IuMD96hI9zR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw52duTbZyWjalnWsh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
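The raw response is a JSON array in which each element codes one comment. A short sketch of how such a batch can be parsed and indexed by comment ID for lookup (Python standard library only; the string below is a two-row excerpt of the batch shown above):

```python
import json

# Excerpt of the raw batch response shown above: a JSON array where each
# element carries a comment ID plus the four coded dimensions.
raw = '''[
  {"id": "ytc_UgzoXSOiGj_svfIF5zh4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyv8wUqdZU3Bvdn0pB4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

# Index the batch by comment ID so one comment's codes can be looked up,
# which is exactly what the per-comment inspector on this page displays.
codes_by_id = {item["id"]: item for item in json.loads(raw)}

record = codes_by_id["ytc_Ugyv8wUqdZU3Bvdn0pB4AaABAg"]
print(record["policy"])  # -> regulate
```

Because the model returns one array per batch, a failed `json.loads` (e.g. truncated output) invalidates the whole batch, so storing the raw string alongside the parsed codes, as this page does, makes such failures auditable.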