Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I AM SO DISAPPOINTED IN YOU! You are someone I imagined would be uninterested in…
ytc_UgyQTM7gR…
from my knowledge, ai training models attach an idea to things theyre being trai…
ytr_UgwEQnwDN…
AI is not a tool because no hammer will build you a shed by itself. AI companies…
ytc_UgyAppiQz…
Pythia Brixham Yes but if a robot didnt have emotions what would stop a very int…
ytr_UggLATWm7…
AI can’t know anything, it matrix math telling itself how likely any two variabl…
rdc_ohhis5g
I love the comments going like "People will revolt and attack the goverment", th…
ytc_Ugzvdnjw6…
Those ai will like and forward the post in Facebook and interesting thing is ai …
ytc_UgwVdA5PC…
NOTHING good will come from AI. This will NOT end well for mankind. Laugh at me …
ytc_Ugw3B7BqM…
Comment
If the customer service from Microsoft would try to have 10% of the quality and efficiency Microsoft Copilot has, Microsoft could call itself a responsible future and ethical company, worthy of becoming a true benefactor for mankind. But in reality the customer service by Microsoft stinks to high heaven, and this company is some of the worst corporate entity.
youtube
AI Governance
2025-10-16T01:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyB3UKQu6GUMQG_sFB4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxIWYJ2cqFwDdSY3zZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwvKveXoqxisJqAixR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzolCqPj9k05GTgRXt4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugyj5YJXw3GbZCdBSnh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy__IkqSoxHF5tb5UB4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw4xOWTB1et6Sz-2Fd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz3m2CoBnEb1Is6Ltt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugy5TO_dtoOxdip01z54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgydjuZiLV_U4uMHFBl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
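The raw response above is a JSON array with one object per coded comment, keyed by comment ID. A minimal sketch of how such a response might be parsed and sanity-checked before the codes are stored (the allowed value sets below are inferred from this one sample and are assumptions, not the project's actual codebook):

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# This is an assumption for illustration, not the tool's real codebook.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself",
                       "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"liability", "regulate", "industry_self", "unclear"},
    "emotion": {"approval", "outrage", "fear", "indifference",
                "mixed", "unclear"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}.

    Rows missing an id, or with a value outside the allowed set for
    any dimension, are dropped rather than stored.
    """
    coded = {}
    for row in json.loads(raw):
        if "id" not in row:
            continue
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            coded[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return coded
```

With the codes keyed by ID, a single comment's coding can then be retrieved directly, which is what the "Look up by comment ID" box above supports.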