Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- ytr_UgwmaBYnq…: @realdavestrider you know you can claim this is actually being said, but honestl…
- ytc_UgycWBEAE…: When AI first came out I was scared for my job but now that I use it hours a day…
- ytr_UgzmlVTUB…: @Limelaz23 Maybe you're right, I don't have much experience with this kind of s…
- ytc_UgxU8xkgV…: All hype, no substance. Anyone who really knows what they're talking about knows…
- ytc_Ugz4LIYwi…: I totally agree with you regarding this video. We're doctors, not computer scien…
- ytc_UgyRcIoVy…: I cant believe them they say "she is gonna tske over the world" Me: WTF just des…
- rdc_kitgjto: My work will be one of the last to be taken over by AI, but if it shrinks the mi…
- ytc_UgwrDLzi2…: I'm afraid of AI. Not because it's "slop" or about the "soulless" aspect of it. …
Comment
Elon wants to slow AI down because he tried to buy open ai and was rejected. He knows whoever is first will remain first as they will be able to use their AI to train AI. He wants to level the playing field, which tbh I agree with, but he's not coming out and saying that. He's pretending he's concerned in order to put more weight behind slowing down.
youtube · AI Governance · 2023-04-18T17:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy1yXSq9QA_S5Zh0Ml4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw1aD5wNQOEPQA45EB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzWY1ltIrhtZcRccI54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwEs4KRepJrYTgOnLV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgxyyoHfeGjorKdzGhl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwxxaW1E0AfiVTJBrB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzlkJogxkpjjW37CrJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxRKldcsk09Ai3Bnnh4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz1glUxdIvJjCroRJN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzcrr0Fj-kMBtsPEp14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
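The raw response is a JSON array of per-comment coding records whose fields mirror the Coding Result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and a coded comment looked up by its ID, using a two-record excerpt of the array above (the variable and function names here are illustrative, not part of the tool):

```python
import json

# Excerpt of the raw LLM response shown above (same record shape).
raw_response = """
[
  {"id": "ytc_UgwEs4KRepJrYTgOnLV4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate",
   "emotion": "indifference"},
  {"id": "ytc_Ugy1yXSq9QA_S5Zh0Ml4AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]
"""

# Index the coding records by comment ID for O(1) lookup.
codings = {record["id"]: record for record in json.loads(raw_response)}

# Look up the comment inspected above; its values match the Coding Result table.
coding = codings["ytc_UgwEs4KRepJrYTgOnLV4AaABAg"]
print(coding["responsibility"], coding["policy"])  # developer regulate
```

Indexing by ID is what makes the "look up by comment ID" view cheap: one pass over the response builds the dictionary, after which each inspection is a single key lookup.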