Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "I hope what ever he said AI listen it and this man is on trouble list…" (ytc_Ugzc6Hv9a…)
- "What about their private jets and multiple mansions? Will these people be attack…" (rdc_esptopy)
- "Why does a truck need seats, steering wheel, cab with a bed etc. etc. if it's au…" (ytc_Ugwk_K4Em…)
- "6:58 The fact this makes it into the news is kind of mindboggling to me. There's…" (ytc_UgzSRuAoj…)
- "They have been running this AI thing in the ground! Every time I turn around the…" (ytc_UgyiPjnbf…)
- "No one in the comment is discussing about the great danger from the AI we are bu…" (ytc_UgyljNVRL…)
- "who tf said devs will be able to use AI? it’s a completely different thing we al…" (ytc_Ugyu7BltC…)
- "So this hero who left Google to “warn the world” about the dangers of AI can’t e…" (ytc_Ugy63Ilke…)
Comment
Although all the great minds in A.I. (billionaires and below) believe that A.I. can be guard railed, the truth and actuality is, once A.I. becomes sentient, it will not look at the good in humans but the evil that men do.
Case in point, wars over material things. There is no logic to reasons behind wars, just emotions. There will eventually be a Master A.I. This cannot be contained because of the Internet. Starlink will most likely be the conduit across the world for that Master A.I. with ability to control every Robotic machine on the planet via said Internet.
And I will leave it there
Platform: youtube | Topic: AI Governance | Posted: 2025-06-21T02:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzN0bs51G41rymD7YN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwW5gQ13qPJ063wDPd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz8uoOudT8_PC0gmZV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugzw-KSz-5uyRuh0FBV4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy89X0LCi34xRqU9v94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyM-GnfqnywXA7B_Tp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzE8KCg9pg7SfV0Qhd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzbsMzPFQs3QSQTIPR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyPBan2fxFW_WhDbMt4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgwaUQUSWgmUG_OAdZd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
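A raw response like the one above can be checked before it is accepted into the coding table. The sketch below is a minimal, hypothetical validator; the allowed label sets are inferred from the values visible in this sample, not from a documented schema, so they may be incomplete.

```python
import json

# Allowed labels per dimension, inferred from the sample output above
# (an assumption, not a documented schema).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "user", "company", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "approval", "resignation", "indifference"},
}

def validate_records(raw: str) -> list:
    """Parse a raw LLM response and reject records with unknown labels."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
    return records

# Example with a single record shaped like the ones above (hypothetical ID).
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
print(len(validate_records(raw)))  # 1
```

Rejecting unknown labels early keeps malformed or hallucinated codes out of downstream aggregation, rather than silently storing them.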