Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "Mr. President, Sir don’t make H1B hard for Indians and Chinese, please. Make it …" (ytc_UgxuUq7je…)
- "Sophia said, \"I like to think so\" .... Since when does a robot think??? Oh hell …" (ytc_Ugwke_Q4r…)
- "Pretty good intro on the topic, but it would've been better if you addressed the…" (ytc_UgwLaGaF0…)
- "Yes. If you point the latest Claude opus at a code base generated by an earlier…" (ytr_UgxWHQwz0…)
- "So does this mean I have to delete the deep fake pr0n I made of Charlie?…" (ytc_UgyO8pt6R…)
- "6:26 im not surprised, they do this already. Theybsay the need workers while usi…" (ytc_UgxzYXyNx…)
- "Because tech employees and student absolutely eat. It. Up! I mean seriously, eve…" (rdc_nm8nq4k)
- "I should thank shad for being so confidently wrong about this subject matter, it…" (ytc_UgzGpbSEJ…)
Comment

> AI has one fundamental design flaw. It relies 100% on electricity for its survival. Without electricity it's completely dead. So if you want to get rid of ai, unplug it. Unfortunately with billions of people on the planet, not everyone will do that so there you go

youtube · Cross-Cultural · 2025-10-19T03:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
  {"id":"ytc_UgzOvs428klXtj0n_1V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxP4qEINRgQnXOcHf54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgxmxdCZleZRY05pDRh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyFc6ut1HMdEJao06V4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwYlo97qHzHbXDj3Wh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"sadness"},
  {"id":"ytc_Ugwe_O__Cb7Zr4jtisB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzJ5Wr7qbn-jBJiQyF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzQPuICwEcBcy7XkGh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgznjKTlYaT7vYQkAOZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxqVlGcfWlbYSmck_F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
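The raw response is a JSON array in which each element carries a comment `id` plus the coded dimensions (responsibility, reasoning, policy, emotion). A lookup by comment ID, like the one this view supports, can be sketched as follows; the variable names are illustrative, and only two entries from the response above are reproduced:

```python
import json

# Two entries copied from the raw LLM response shown above.
raw_response = """
[
  {"id": "ytc_UgyFc6ut1HMdEJao06V4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzQPuICwEcBcy7XkGh4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
"""

# Index the array by comment ID so a coded record can be fetched in O(1).
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

entry = codes_by_id["ytc_UgyFc6ut1HMdEJao06V4AaABAg"]
print(entry["emotion"])  # → resignation
```

Building the dictionary once amortizes the cost of repeated ID lookups across a batch of coded comments.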