Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by comment ID.
Comment
Microsoft. Bill gates made his billions by selling defective operating systems to the public and institutions. Got them to debugg it for him then sold everyone a new one. ( RINSE AND REPEAT)
As someone who worked in education for 38 years, not only did it show me it was a rip off but it taught me the people we supposidly trust to know what they are doing Don't. Half the time they have'nt got a clue. Just as a small example. A system would need an update. But when it was done, the it crew could spend sometimes as long as 6 months fixing all the things that had gone wrong due to said update. They did'nt know why.
If they can't manage or predict basic systems can you imagine what they don't understand about A.I. and its capabilities if its self learning. SKYNET STRIKES BACK.
Source: youtube · AI Governance · 2026-04-07T08:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyEC1LKTuYRZr_0I_V4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgwuDluXcgFJm22OaVx4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgyUR5ePj-znxaPbOaZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx-WUd01SaIH7yGOW54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzhtWjVzGeK6QLLGqF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxFZfpfCiZGF2E7v754AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugz91DQJgWc2HcULoLd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwZZJtneETSlTus_-p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxxtX4sLqB0QYN4DOx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy1gaBhNUmJMo2x1Yt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
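The lookup-by-ID flow above can be sketched as follows. This is a minimal example, assuming the raw model response is a JSON array of objects that each carry an `id` plus the four coding dimensions shown in the table; the variable names are illustrative, not part of the actual tool.

```python
import json

# Raw LLM response: a JSON array of coded comments (two rows copied
# from the sample response above; the schema is assumed).
raw_response = '''[
  {"id": "ytc_Ugz91DQJgWc2HcULoLd4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyEC1LKTuYRZr_0I_V4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"}
]'''

# Index codings by comment ID so any coded comment can be fetched in O(1).
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coding for a specific comment ID.
coding = codings["ytc_Ugz91DQJgWc2HcULoLd4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # company outrage
```

In practice the same dict-by-ID index would be built once over the full response batch, so the "look up by comment ID" box can resolve any ID without rescanning the array.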