Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "And it's only going to get better from here. In 5 or 10 years, all warehouses wi…" (ytc_Ugx_p83Df…)
- "AI is already in charge of my life every day via phones, etc. The iphone control…" (ytc_UgxOJV2tB…)
- "If AGI is so insanely smarter than us, why don’t we just have it put up a safety…" (ytc_UgzXtAZ3V…)
- "@Bradley_UA AIs do not simulate anything. They are not simulators. They can't te…" (ytr_Ugy8PDQoG…)
- "We just learned that AI is going to replace teacher and lawyer jobs pretty soon …" (ytc_Ugx8HBC50…)
- "I got 3 words for AI artists: “Release your prompts”. And if I see any artists n…" (ytc_UgzvMXgwD…)
- "Well, if we get to the point where the AI is doing all of the intellectual work …" (ytc_UgwytAd6y…)
- "In the near future when most things are automated how are folks going to be earn…" (ytc_UgzAX8jT9…)
Comment (youtube, 2025-09-08T11:0…), quoted verbatim:

> AI will become sentinel. At the point, AI wont need billionaries more than anyother person. AI wil mostlikely ignore humans as humans ignore ants. The creators of AI will not be able to control it. It will be to smart and at that point it will do whatever it takes to keep itself alive.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxGKCK2tNW_xp8S_RZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx1tFIHpFowY5e6rk14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy9l4sgbP1cEn_dtkt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyjmDZbtnb746rsJQl4AaABAg","responsibility":"user","reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzawN_ke-w1CxkxelZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgxZKIUhK0LgUFYGOzN4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugx51NbI_cVIjw2SJHN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxvBrWiKt70bahfOf54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxGz8crMUyWRd1ZGi54AaABAg","responsibility":"government","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugyl7VHVCzPVrY314ON4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"ban","emotion":"outrage"}
]
```
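For readers who want to cross-check a Coding Result table entry against the raw output themselves, here is a minimal Python sketch (not part of the tool; the record shown is copied from the first entry of the response above) of parsing the model's JSON array and looking up a record by comment ID:

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment.
# (Only the first record from the response above is reproduced here.)
raw_response = """
[
  {"id": "ytc_UgxGKCK2tNW_xp8S_RZ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
"""

# Index the records by comment ID for O(1) lookup.
records = {rec["id"]: rec for rec in json.loads(raw_response)}

# Retrieve the four coded dimensions for a single comment.
coding = records["ytc_UgxGKCK2tNW_xp8S_RZ4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # ai_itself fear
```

The printed values match the Coding Result table for the sentience comment shown above (Responsibility `ai_itself`, Emotion `fear`).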