Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
> The competition to be the best in market, best selling, most advanced, most efficient whatever it is. It's like any other race amongst countries, but it's directly much more potent for the simple fact that it's conscious. It has a mind of it's own and have proven via ChatGPT that it can reach beyond program and it was completely unprecedented. Would you rather have a liability that can result in extinction and have all the advancements or would you stay as is and survive?

youtube · AI Governance · 2024-02-20T11:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
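The four coded dimensions above can be sanity-checked against the label sets that appear in the raw responses on this page. A minimal validation sketch in Python (the label sets are inferred from this page's data only, not from the full codebook, which may define more categories):

```python
# Allowed labels per dimension, inferred from the raw LLM responses on this
# page; the actual codebook may include additional categories.
SCHEMA = {
    "responsibility": {"ai_itself", "company", "developer", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "disapproval", "resignation", "indifference", "mixed"},
}

def invalid_fields(row: dict) -> list[str]:
    """Return the dimension names whose value falls outside the known label set."""
    return [dim for dim, allowed in SCHEMA.items() if row.get(dim) not in allowed]

# The coding result shown in the table above passes the check:
row = {"responsibility": "distributed", "reasoning": "consequentialist",
       "policy": "liability", "emotion": "fear"}
print(invalid_fields(row))  # []
```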
Raw LLM Response
```json
[{"id":"ytc_UgxtVrnZ6k41AyRYHph4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyS29x7rLEVq2RYtDR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw_XKmw33426dfm6pZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw_TLrCP46OZCf14n14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzYWcmFwUgyUYDloYR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxupasrPTYgl4FBbZN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwZ9z6521AlP7rAfgR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"disapproval"},
{"id":"ytc_Ugxhs4Y1L_37ujTQu9F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw0lxjjilUVlkkV1bN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugymu7_E7j-SHf20yOR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}]
```