Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I I a friend that works in a robot industry they already released robots to publ…" — `ytc_UgwtrBNF_…`
- "You mean ubi? Yeah, they will totally be willing to give everyone free money for…" — `ytr_UgyaUi3ai…`
- "Its terrifying how many people this has convinced, i guess a caption is all you …" — `ytc_UgxJGoGti…`
- "Just wait until we are in full A.I communities and workplaces.. Your privacy wil…" — `ytc_UgyO18e2H…`
- "If I wanted AI generated art. I would go to google. But if im searching for art,…" — `ytc_UgxMUKTM1…`
- "1:40 well thats a first, Charles you a dummy, if you design a city and the build…" — `ytc_UgyVbTtgD…`
- ""Greed has no limits" This line was for the companies who are developing ai and …" — `ytc_Ugzv7SWtO…`
- "As someone British who lives abroad now I simply don't get why more people don't…" — `rdc_d7ksj2k`
Comment

> Who knows enough about it to regulate it ? Only the developers know where it is going, how they have developed it, what information has been fed to AI, which Politics AI has ? Great discussion, alarm bells ringing , but those in AI are not going to self regulate, they are not going to restrict their own product development and careers. It is already at a level of analysis many times faster than humans. The only regulation that may be possible is of where it can be used and what for, NOT HOW IT FUNCTIONS.

Source: youtube · AI Governance · 2023-04-20T19:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
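A coded record like the one above can be checked against the coding scheme before it is stored. The sketch below is a minimal validator; the allowed values per dimension are inferred from the responses shown on this page, not taken from an official codebook, so treat `SCHEMA` as an assumption.

```python
# Hypothetical allowed values per coding dimension, inferred from the
# raw LLM responses on this page -- the real codebook may differ.
SCHEMA = {
    "responsibility": {"developer", "company", "government", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "unclear"},
    "emotion": {"fear", "outrage", "approval", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return the dimensions whose coded value is missing or not allowed."""
    return [dim for dim, allowed in SCHEMA.items()
            if record.get(dim) not in allowed]

# The coding result shown in the table above.
coded = {"responsibility": "developer", "reasoning": "consequentialist",
         "policy": "liability", "emotion": "fear"}
print(validate(coded))  # [] -- every dimension holds an allowed value
```

Running `validate` on each record as it comes back from the model catches hallucinated categories (e.g. a made-up emotion label) before they contaminate downstream counts.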
Raw LLM Response
```json
[
  {"id":"ytc_UgzthL0s5Is2nmIkR094AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyeNjXjdYeCwOgECkB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwvzgqDSQkGQQOQF0J4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwCMsVQROjuJJL0Cz94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzkpkgNT2WZOR-CMHZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzIgbF29pYS9sPbQVx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgybxRvVCoEE_dA1D1B4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgzQHxVCdjeaL0o8UWJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyxxAa6yUhxIZXmEcZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzrY88rC19UETnJ2Cd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```