Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up a comment ID or by browsing random samples.
Random samples (click to inspect):

- "@bulaloitech You are right. Engineers will continue making it replace every o…" (`ytr_UgyD6D7Mr…`)
- "If AI takes over nearly all human jobs then it has access to physical production…" (`ytc_UgybVWplB…`)
- "And how many drunk drivers commit the same offense in the US annually and don't …" (`ytc_UgwV-wgNZ…`)
- "There are both sad and happy outcomes, but don't forget how many lives Tesla Aut…" (`ytc_Ugy60AjfV…`)
- "Thank you for sharing your thoughts! It's important to remember that AI, like So…" (`ytr_UgxlsZ49d…`)
- "I tried this and ChatGPT answered "Everyone in the room agrees with you because …" (`ytr_UgzjTQZd-…`)
- ">They are good AI devs. Not great app / website devs. Well, if they're so go…" (`rdc_jefgncg`)
- "Its not bad but i think people should be able to do what they want in privacy wi…" (`ytc_Ugw2RVCon…`)
Comment

> It's disheartening to see how fear often becomes the default reaction. If AI truly possesses the intelligence we attribute to it, it would logically prioritize actions that ensure both our survival and its own, recognizing that our well-being is intricately linked to its own.

Source: youtube · AI Governance · 2024-03-13T09:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgyYEw4MCZC6EOoFVwh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwzVonEx2c8ZQHTcPR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyRk0z5FryI3q61Wo14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwslHYHdwu7qTWeywN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyqicRGXRXM4pmh8kh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugw0f7PAyu-743d2xO54AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugx4M6kjy8D8eyQKYhF4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugyr7SZ22tNky3Vu4M94AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugyrv4rHeoYpcS2XOGl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwNiPEBQI9Txbt6mNx4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"mixed"}
]
```
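The raw response above is a flat JSON array, one object per coded comment, keyed by `id`. A minimal sketch of the lookup-by-comment-ID step might parse that payload and index it by ID. The `lookup` helper and the trimmed one-record payload below are illustrative, not the tool's actual code; only the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the example above.

```python
import json

# Trimmed sample payload: one record in the same schema as the
# raw LLM response shown above.
raw_response = """
[
  {"id": "ytc_UgwslHYHdwu7qTWeywN4AaABAg",
   "responsibility": "ai_itself",
   "reasoning": "consequentialist",
   "policy": "none",
   "emotion": "fear"}
]
"""

# Index the coded records by comment ID for constant-time lookup.
codes = {record["id"]: record for record in json.loads(raw_response)}

def lookup(comment_id):
    """Return the coded dimensions for a comment ID, or None if absent."""
    return codes.get(comment_id)

print(lookup("ytc_UgwslHYHdwu7qTWeywN4AaABAg")["emotion"])  # fear
```

Indexing once into a dict keeps repeated ID lookups cheap even when the response covers a large batch of comments.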