Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- “I drive a Tesla and i completely trust the self driving it’s definitely better t…” (ytc_UgwRN5knI…)
- “Almost every single word and insert in that video is fabricated as propaganda al…” (ytc_Ugy1wRXX7…)
- “AI has done nothing but make society go from really bad to MUCH WORSE 😂 AI was t…” (ytc_Ugxfu10Tl…)
- “The M3GAN 2.0 movie deals with AI killing us and being autonomous. Hollywood wil…” (ytc_UgzB08jgF…)
- “Why did we train AI on the internet when that's literally the place humans act t…” (ytc_Ugwd5Ok01…)
- “i guess they will come out with a ai that will shop !!!!!!!!!! who even knows...…” (ytc_UgxmhIT4Z…)
- “Oh, i can't wait until they replace news host as well with A.I. At the end of th…” (ytc_UgwfqdTV0…)
- “Problem is once anyone can create an AI system its almost a guarantee someone wi…” (ytc_Ugw-qr0Ax…)
Comment
> The umbrella risk of any technological advancement can be reflected by the example of giving a great tool to a toddler. Most likely will harm themselves and/or others.
> Our society, culture and in general our way of being, thinking and existing is primitive, is a toddler compared to the capacity that these tools have.
> If we don't advance our structures they won't be able to utilise in a helpful way. Even more in a way that would allow us to harvest its true power and change/restore life on earth
> If you would give a phone to a neadhertal they would try to use it either to break or kill sth. We are the neadhertal and AI is the smartphone
youtube · AI Governance · 2025-06-25T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy1P_s64nxNuoQlO6N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzvAt9XKA8-kcQCe1d4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxZVd91N5xtdPErOz14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz-U-9wKe-l4qHZQud4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgywDRPC6DBfiIdzho54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwfBFqQe2sV-q1kva94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw8Ps-fTu_wUQm45Tl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwtVXv97glMJNcRvWt4AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzRU_E1nTltAUCqBz94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgxBEu_-7h0G9GXjwY94AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
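The raw response is a JSON array with one object per coded comment, keyed by comment ID across the four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and indexed for lookup by ID — the field names come from the response above, but the sample rows and the `index_codes` helper are illustrative, not part of the tool:

```python
import json

# Illustrative raw LLM response: a JSON array of per-comment codes
# (two rows copied from the full response above).
raw = """
[
  {"id": "ytc_UgwfBFqQe2sV-q1kva94AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgywDRPC6DBfiIdzho54AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
"""

REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codes(raw_json: str) -> dict:
    """Parse a coding response and index rows by comment ID,
    skipping any row missing a required field."""
    rows = json.loads(raw_json)
    return {row["id"]: row for row in rows if REQUIRED_FIELDS <= row.keys()}

codes = index_codes(raw)
print(codes["ytc_UgwfBFqQe2sV-q1kva94AaABAg"]["policy"])  # regulate
```

Indexing by ID mirrors the "Look up by comment ID" behavior of the page: each coded dimension for a comment is then a single dictionary access.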