Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- “Shareholders are so dumb! Pray their investments get totally wiped out in the u…” (ytc_UgxY8abnH…)
- “I don’t get you guys, one segment you say AI ahas no use case outside of P0RN an…” (ytc_Ugzafn1pE…)
- “@GorillaWithAPhone AI can’t have racism it can only have data if data says black…” (ytr_UgyBUUWLl…)
- “Did you watch the video at all? AI is racist because it's using data from humans…” (ytr_Ugwhhxep-…)
- “Got into an argument... AI bros says AI art is free and accessible but datacente…” (ytc_UgxpYbYD7…)
- “Company need to realize that AI is recycling the information that they gave you …” (ytc_Ugz-BR4IM…)
- “Well assuming the cause is biased data sets, racism has been deep rooted for so …” (ytc_UgzyMTNQJ…)
- “Really? AI built by racist and sexist people are now racist and sexist too? Shoc…” (ytc_UgwIW655h…)
Comment
@27:00 Altman is coming across a bit political and walking around questions and responsibility… he’s being asked specific questions, and he kind of puts it back on the interviewer as not being a fan of AI, just because he’s getting asked ethical questions… @30:00 I like that they brought this question up, once again Altman avoids personal responsibility.
The idea that his job is done and that there’s no framework that needs to be put in place, there are no lines that can’t be crossed etc. is ridiculous.
As someone who hasn’t seen him speak before, I find it either naive or a deliberate act…
He seems articulate and intelligent, so it really comes across to me as this unique sense of entitlement.
Source: youtube · 2025-05-20T22:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxV4wE-Rd57gFL0f6d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugym7Dt9ab6PwdOBNmV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxTDbcZ3Q5HhSQ45px4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwK_zSJz049p_1imiV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgykzaS6Wn-K9muQxvJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxOyYCokeHOdxwdC_J4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy23MsKqd34kfGGbBt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyW1wdqc46WZ10DOq54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwbLb-3HP2Odz2Ki7J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxldbD_DHDzCVmiBxB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
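A response like the one above can be turned back into per-comment codes with a small parser. This is a minimal sketch, not the app's actual implementation: the `ALLOWED` vocabulary below is inferred only from the values visible in this example, and the real codebook may define more categories. Rows whose values fall outside the assumed vocabulary are dropped rather than coerced.

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred from the
# example response above; the real codebook may include more categories.
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded rows) into
    {comment_id: {dimension: value}}, skipping rows with a missing id
    or an out-of-vocabulary value."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        codes = {k: v for k, v in row.items() if k != "id"}
        ok = cid is not None and all(
            v in ALLOWED.get(k, set()) for k, v in codes.items()
        )
        if ok:
            coded[cid] = codes
    return coded
```

Feeding the raw response shown above through `parse_batch` yields a dict keyed by comment ID, so a lookup like the one at the top of this page is just `coded["ytc_UgxTDbcZ3Q5HhSQ45px4AaABAg"]`.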