Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Hypocritical and immoral behaviour in the highest level from him. "I didn't know what I was creating" part that Terminator movie came up 1984- 41 years ago and it was good till He made money but now he is 77 and will be bashing on Elon Musk, how people always thought are special through history, blaming economical system, people believes and everything and everyone else part himself and others like him. The best part, He is going around telling what a mess he and others like him created , how dangerous it is, or how AI is going disrupt or social, work environment or even destroy humanity in total. The worst part is that the Pandora box is opened and is no way back, so it is a completely useless conversation.
Source: youtube · Topic: AI Governance · 2025-07-04T02:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugwj4LxmCA-k9ZHHvJV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyuWKE7mgR-BO31eet4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxHYIH3d46O2KWQDcp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxXWQ0F07IyY2JcBzp4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzAAk1NfSe5A2bRiqt4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw_OgitN4NW-UTkmSl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxVSuyyKbU3NP9WsT14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugwvj4tNPl9uSJoFObZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx5JVIdIFkIermh7XB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgwbpKGqwtEUtIWSvR14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
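The raw response above is a JSON array in which each record carries the comment ID plus one label per coding dimension. A minimal validation sketch is shown below; the allowed label sets are inferred only from the values visible in this response and in the coding-result table, so the real codebook may define additional categories, and `validate_coding` is an illustrative helper, not part of the actual pipeline.

```python
import json

# Allowed values per dimension, inferred from the labels seen in this
# page (assumption: the actual codebook may include more categories).
SCHEMA = {
    "responsibility": {"company", "developer", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "ban", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and check every record
    against the schema; raise ValueError on malformed output."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dataset start with ytc_ (comments) or ytr_ (replies).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"bad comment id: {rec.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: invalid {dim}={rec.get(dim)!r}")
    return records

# Example: validate the first record from the response above.
raw = ('[{"id":"ytc_Ugwj4LxmCA-k9ZHHvJV4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]')
records = validate_coding(raw)
print(len(records))  # 1
```

Validating before storing is what makes a "Coding Result" table like the one above trustworthy: a hallucinated label or truncated JSON fails loudly instead of being silently written to the database.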