Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I still haven’t heard any of these ai experts describe in detail how this ‘takeover’ actually occurs. I completely appreciate how jobs are already being made redundant by ai but there’s a big leap to tech ai giants ruling the world. Like what actually happens? So, let’s say a company asks for a plumbing job to be done on their factory and they call a local plumbing company and they send a robot to do the work. Why would Elon or Sam own this robot? Don’t they sell robots to the end user/other companies? Why would Elon be running plumbing companies and make all the money from them? Large agricultural companies would simply automate more and more and the possibly use humanoid robots Elon or Sam won’t own all the agricultural companies. Maybe I just don’t get it
youtube · AI Governance · 2025-12-05T01:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
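Each coded record carries the four dimensions shown above. A record like this one could be checked against the label vocabularies visible in this page's output; the sketch below (Python) treats those value sets as assumptions, since the full coding scheme may allow more labels than appear here.

```python
# Value sets observed in this page's output; the full coding scheme may
# define additional labels, so treat these sets as assumptions.
ALLOWED = {
    "responsibility": {"company", "developer", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "approval", "mixed"},
}

def check_record(rec: dict) -> list:
    """Return a list of human-readable problems with one coded record."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = rec.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in {sorted(allowed)}")
    return problems

# The record from the table above passes cleanly.
rec = {"responsibility": "unclear", "reasoning": "consequentialist",
       "policy": "unclear", "emotion": "mixed"}
print(check_record(rec))  # -> []
```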
Raw LLM Response
```json
[
{"id":"ytc_UgyUxRpcW3m4Oa8MQOt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxNgHvetyJmP3wNpPp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyMDC8m8jDdWEVovjR4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz4lg1qUiS4XAVXo-t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxRG72du5S7mL9FC2B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyQVBtuL3R9eNXG3yt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyq8p95FKR0z5RJl5d4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxnrwHG1jZaYPrmbth4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzdTgHylS6wkMQUOsd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyK9fPBALTcFds3HAR4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
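The raw response is a JSON array with one object per coded comment, keyed by comment ID. A minimal sketch (Python, with a hypothetical helper name `parse_raw_response`) of parsing that output and indexing it by ID, as a lookup over these records would need:

```python
import json

# The four coding dimensions every record is expected to carry.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM batch response and index records by comment ID.

    Raises ValueError if a record lacks an "id" or any coding dimension,
    so malformed model output fails loudly instead of being coded silently.
    """
    by_id = {}
    for rec in json.loads(raw):
        if "id" not in rec:
            raise ValueError(f"record without id: {rec!r}")
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"{rec['id']}: missing {missing}")
        by_id[rec["id"]] = rec
    return by_id

raw = ('[{"id":"ytc_x","responsibility":"company","reasoning":"virtue",'
       '"policy":"regulate","emotion":"outrage"}]')
coded = parse_raw_response(raw)
print(coded["ytc_x"]["emotion"])  # -> outrage
```

Indexing by ID up front makes per-comment lookup O(1) rather than a scan of the array on every request.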