Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
Flatbed drivers secure their own load. I don't see how AI is going to eliminate …
ytc_UgyJZusz5…
@thewannabecritic7490
If it is a hobby, why do people pay for artwork? Cause ar…
ytr_Ugxm0d_Tm…
I like how the title was changed from "tesla autopilot kills motorcyclists" beca…
ytc_Ugx51pLGE…
And the future is even more horrifying. Area denial autonomous weapons the size …
rdc_ohut20a
Why not have each employee have a robot. One robot for each employee.
Kinda like…
ytc_UgwEzwKyh…
I'm at 4:00, and it sounds like they will be talking about the infinite paper cl…
ytc_UgwEDAY-B…
Is it really that surprising when the ratings are measured "per capita" so highe…
rdc_da40cmf
lmao my god... even thinking of this as "new tech" is a joke. Oh yeah once upon …
ytc_UgzhEK8iz…
Comment
Andrea's comparison of AI to nuclear weapons is missing one key difference: it took the US government years and a significant amount of manufacturing output to develop the first two nukes, and they compartmentalised everything so that no one person had complete knowledge of how they were built. With AI it's not the government developing it in secret, it's huge corporations locked in a public race to be first - and there are no safeguards. Oppenheimer had the option of shutting the Manhattan Project down if the risk of atmospheric ignition was deemed too great. But these AI companies are ploughing ahead regardless of the warnings that many of their own staff and even some of their founders are now voicing. There's no putting this genie back in the bottle.
youtube
AI Jobs
2026-02-17T21:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzTZo4yTPRCyk5SHf94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwFjpknWHdSEuAGD4V4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwt2tYbuRRE96K9TkB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy_5X-HWjWmqtt6JS14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx6pyy3RCtkPlutvHR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwdpizzBoaI0qsA6Zl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyr8ZOsDir9YBY1RDB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz--JZr0kYkDOTJ4kN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgybiY0pLiRJnnbTWBx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwVAulQj8fFiTwH5GB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
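The raw response above is a JSON array of per-comment records, each carrying an `id` plus four label dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a batch could be parsed and sanity-checked before storing the codes, assuming the allowed label sets are exactly the values visible in the responses on this page (the real codebook may define more):

```python
import json

# Allowed values per dimension, inferred only from the responses shown
# above; the actual codebook may include additional labels (assumption).
ALLOWED = {
    "responsibility": {"company", "none", "ai_itself"},
    "reasoning": {"consequentialist", "virtue"},
    "policy": {"unclear", "none", "regulate", "liability"},
    "emotion": {"fear", "indifference", "approval", "outrage", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and check every record's labels."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {value!r}")
    return records

# Usage with a single hypothetical record:
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
print(validate_batch(raw)[0]["emotion"])  # → fear
```

Rejecting the whole batch on one bad label keeps malformed model output from silently entering the coded dataset; a softer variant could instead drop or flag individual records.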