Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Fascinating. Here's an issue. If the board fires everyone and eventually somethi…
ytc_UgzdfAllU…
We appreciate your opinion. The development of artificial intelligence aims to e…
ytr_Ugx9tf4Aw…
I have to add, after some more experimenting with AI tools, there are AIs that d…
ytr_UgzyKHTtt…
Literally use a.i. to find the relevant articles and info it composes for answer…
ytc_Ugz5wbeHX…
ChatGPT sucks. It makes many mistakes and errors on a regular basis (Grok as we…
ytc_Ugyjut_ub…
Before AI came along at my former employer the upper management tried to treat s…
ytc_UgwHe_R6h…
AI will continue to be developed. That is a fact.
We won't even know when it's a…
ytc_UgzBO3joQ…
I wanted to leave a comment on your AI circular funding video suggesting/begging…
ytc_Ugz-8lEQo…
Comment
Geo blocked? Just use a vpn. About AI as weapons, that is a really really bad idea! Make friendly AI robots with compassion. This is serious. AI will become much smarter then us. Also, weapons can be hacked by bad guys if they have a bad agenda that is. Could be hacked by terrorist, etc. I think we should stop develop weapons. We don't want to blow ourselves up. AI can be used as a good purpose instead. Serve humanity. Not have weapons on them. That could be scary. One day an AI might start writing code on their own also. Who knows.
youtube
2018-04-18T15:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugzq4Q_khAOQr_8ku3J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxQncBw-CN965L8N894AaABAg","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx1QkCrhPsZfbBPWwt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzDPEKrbLqUafCfBaR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugz1akl15VFUobJauOl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyYcak1jeRRrbt89xF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzIJ7IaCFjD0W9YyZV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw-RpLOAS9Y8VxL0KR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyfnBJ2M1YJEY2TRXt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwqsAaYQNKgOhBYURV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"approval"}
]
```
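A batch response in this format can be parsed and indexed by comment ID in a few lines of Python. This is a minimal sketch, not the pipeline's actual code: the `raw_response` sample is an abbreviated illustration of the array shown above, and the `index_by_comment_id` helper name is invented here for the example.

```python
import json

# Hypothetical sample mirroring the raw LLM batch response above:
# one JSON object per coded comment, keyed by the comment's ID.
raw_response = """
[
  {"id": "ytc_UgzDPEKrbLqUafCfBaR4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugzq4Q_khAOQr_8ku3J4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw batch response and return a dict keyed by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

lookup = index_by_comment_id(raw_response)
# Look up the coded dimensions for a single comment by its ID.
print(lookup["ytc_UgzDPEKrbLqUafCfBaR4AaABAg"]["policy"])  # ban
```

Keying on `id` makes the "look up by comment ID" operation a constant-time dictionary access rather than a scan over the array.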