Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytr_UgyZX6qdr…: "@tarlkudrick1174 I would rather not have my daughter perform a liver transplant …"
- ytc_UgxcVzyVj…: "How in the fuck is "usisg a digital software" to draw anything even clise to AI.…"
- ytc_Ugx4pcAfD…: "It’s hard to describe how happy this makes me. I’ve been living in fear that ai …"
- ytc_UgwCA9od_…: "Without jobs, you don't have an economy. It doesn't makes business sense to repl…"
- ytr_Ugxmws7P_…: "Yeah, agi can eat, reproduce, travel and push the button to launch nuclear bombs…"
- ytc_UgzU4BU3Y…: "Oh god, just imagine if all the chats to chatGPT or other chat bots were just pe…"
- ytr_UgzaRAGCX…: "Further to that, I am a thousand percent on board with the idea that we are not …"
- ytr_UgxoQPSqN…: "@mekingtiger9095 dude, ai can't do something without being asked and fulfills th…"
Comment
Truck dispatchers tell drivers to do dangerous crap every day, and their drivers as professionals have to tell them “no.”
A lot of people are going to get hurt when somebody in a warm office 2000 miles away can send a truck over an icy mountain pass at the push of a button. An autonomous truck doesn’t fear for its life or career. It does exactly what the company tells it to do, and that’s a problem.
youtube · AI Jobs · 2025-05-29T03:2… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyeFZg1pyYyGH0b0wd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzcfL4AEG978upHNj54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw08YXEkacYADtXa514AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwu5aVHITgnPY6X4Ep4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx-Rdno-6gaVeVBDzx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw3rByoJ_kEoUzXRol4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugxi6YTmGjP8zUKNCwV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw1DJrtQeCUlK4kyDl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzPPuNYeWKtpT-rZ4Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyFHb-8PQmDZ6Hn3qB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
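A response like the one above can be checked programmatically before the codes are stored. The sketch below is a minimal validator in Python, assuming the allowed values for each dimension are exactly those seen in the samples here (the full codebook may define more categories, so `ALLOWED` is an inferred, hypothetical schema, not the tool's actual one):

```python
import json

# Allowed values per coding dimension, inferred from the sample output above.
# Assumption: the real codebook may contain additional categories.
ALLOWED = {
    "responsibility": {"company", "government", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "approval",
                "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coded comments) and
    return one error record per row that is missing a field or uses a
    value outside the allowed set. An empty list means the batch is clean."""
    rows = json.loads(raw)
    errors = []
    for row in rows:
        missing = {"id", *ALLOWED} - row.keys()
        bad = {k: row[k] for k, vals in ALLOWED.items()
               if k in row and row[k] not in vals}
        if missing or bad:
            errors.append({"id": row.get("id"),
                           "missing": sorted(missing),
                           "bad_values": bad})
    return errors

raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"deontological","policy":"liability","emotion":"outrage"}]')
print(validate_batch(raw))  # → []
```

Rows that fail validation can then be re-queued for the model rather than silently stored with an unrecognized label.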