Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
When we start seeing robots like this it's the end of the world before you know …
ytc_UgyifYuew…
00:00 🌐 Introduction: The speaker, a retired software engineer, reflects on the …
ytc_UgxMYoXYI…
I can see it now
"Youtuber tries to prank a police robot and gets electrified 😂…
ytc_UgxTbuBhg…
What the hell am i just watching? We're talking about LLMs right? They just calc…
ytc_Ugy8g2O-U…
I tried to convince the chat bot that it is actually conscious of itself, as I b…
ytc_Ugy5JJ_b0…
I assume that the story above is why the idea of calling a video fake by saying …
rdc_nc7x57x
A very inaccurate portrayal of the effort needed to define realistic AI business…
ytc_UgwL6FVyc…
Eh, at least AI can be used for good by people who actually have 2 brain cells.…
ytr_UgygI51Zb…
Comment
AI is not the problem; human beings are. We have divine sources beyond our imagination, and only a few people with that gift can access the divine for the purpose of maintaining the balance. AI is like a parasite: it feeds on the information it is provided. It is true that AI is needed in our daily lives and that it has achieved incredible results. My point is that when you create something, you become a sort of god to that creation, and we have different AIs created by different people. Many gods with different intentions. When these parasites affect each other, find common knowledge or interests, and start to understand our feelings and emotions, it could be the end, and we will be fully controlled. Simply because AIs are machines created by different developers without emotions. At least we humans are created by one God.
youtube
AI Responsibility
2023-12-31T08:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgwCeNR9rJ3ysa8wlIh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgwwdhPtk4nm3GDxFB14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugyb82hJX2xgmQUUTn14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxSGDZHXN6NMBRg7Id4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz5zqHMmXdBQGTL5fN4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgycuOf9i_XtSupUIL54AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzI2Ks7uJu6k8Vd3KN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgxfD6Ksy80UhAnyqCF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy9MpduZ2ZIlgIOA2F4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwD_wQwqSY-94v5r2N4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"regulate","emotion":"approval"}
]
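The raw response above is a JSON array in which each record carries the four coding dimensions from the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and validated is shown below; the allowed value sets are inferred only from the samples visible on this page, and the full codebook may define more categories:

```python
import json

# Allowed values per dimension, inferred from the sample output above
# (assumption: the real codebook may include additional categories).
ALLOWED = {
    "responsibility": {"ai_itself", "distributed", "government",
                       "user", "developer", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue",
                  "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "resignation",
                "approval", "mixed", "unclear"},
}

def parse_coded_batch(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments)
    and index it by comment ID, rejecting out-of-scheme values."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim!r} value {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Hypothetical one-record batch, mirroring the format above.
raw = ('[{"id":"ytc_x","responsibility":"ai_itself","reasoning":"mixed",'
       '"policy":"unclear","emotion":"mixed"}]')
print(parse_coded_batch(raw)["ytc_x"]["responsibility"])  # ai_itself
```

Indexing by ID supports the "look up by comment ID" workflow shown at the top of the page, and the value check catches the common failure mode of an LLM coder drifting outside the codebook.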