Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples — click to inspect
- ytc_UgxMaAxsO…: when I'm feeling slightly depressed: "I think, People are toast. AI is gonna tak…
- ytr_UgxDOFs-e…: @EpicestGamer2I agree with you but passion isn’t finding the easy way out or ta…
- ytc_Ugx2qgf3-…: why this guy talks and dresses like a fucking clown in a conference about AI…
- ytr_UgxDO6vc8…: There will be no future the end of humanity is not far away the genie is out of …
- ytc_UgzzKnLCP…: Yes human biases do become part of the technology we create especially those of …
- ytr_Ugxq8ue2R…: Your perspective is definitely thought-provoking! The dialogue highlights a crit…
- ytc_UgzPqsh2p…: Even worst...these lonely people fall in love with their AI partner, giving them…
- ytc_UgwE8jlLy…: These guys are living in a tech bubble. The vast majority of people don’t want t…
Comment
Literally every 2nd year engineering student will tell you, you always need redundant systems for critical systems.
Edit: I'll spell it out for all the bots, fan boys, and Elon D riders: you can wipe your eyes, you have ears and other senses, you have the common sense to not drive in really bad conditions, and believe it or not your eyes are much more advanced than the $5 web cams they're putting in Teslas. Additionally, to reach level 4/5, the vehicle must be infinitly more reliable than a human, not on par with, and the issue of liability still hasn't been addressed. But next year bro. Next year, we're gonna have robot taxis on mars.
Source: youtube · Posted: 2026-04-08T15:3… · ♥ 344
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugy9tdBbB9hrFjKYKmV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzlKLZNeTJmLUULum54AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyoEaRVuI0jq5zye7F4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx2ZUtG539KTnT_nrN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyACA_tP1esiLUe6mx4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugw28IIUnhqMgH8l6Jd4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugw5piqVZRMOW_40pWl4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugy_6aFmUbFEvAQ33up4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyACopbj6pnHeKvo5l4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgzlEYEuPjH6G9ATpcl4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]
```
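The raw response above is a JSON array with one object per comment ID, which is what makes the look-up-by-ID step possible. A minimal sketch of that step, assuming plain `json` parsing (the helper name `lookup_coding` is illustrative, not part of the tool):

```python
import json

# Two records copied from the raw LLM response above, used as sample input.
raw_response = """[
  {"id": "ytc_UgzlKLZNeTJmLUULum54AaABAg",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyACA_tP1esiLUe6mx4AaABAg",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "outrage"}
]"""

def lookup_coding(response_text, comment_id):
    """Parse a raw LLM response and return the record for one comment ID,
    or None if the model did not code that comment."""
    records = json.loads(response_text)
    return next((r for r in records if r.get("id") == comment_id), None)

coding = lookup_coding(raw_response, "ytc_UgzlKLZNeTJmLUULum54AaABAg")
print(coding["policy"])   # regulate
print(coding["emotion"])  # fear
```

Returning `None` for an unknown ID (rather than raising) makes it easy to spot comments the model skipped when cross-checking a batch against the coding table.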