Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- HE IS SPEAKING FACTS, A.I IS ONE OF THE MOST DANGEROUS THING ON THIS PLANET WE C… (ytc_Ugy8vggn0…)
- ChatGPT can't even count to 200, without making mistakes, and human beings belie… (ytc_UgwPzQaPs…)
- "*Big Tech fuels 'growth' with crime*" is a bit overdramatic. Google deliberat… (rdc_mss2ucr)
- lol dont force AI to believe a lie, says the guy who supports isreal and believe… (ytc_Ugzh4EZz0…)
- I also think its really really sad that these 'ai bros' rob themself of so much … (ytc_Ugw-UuVrK…)
- Well, Elon Musk said with AI none will have to work, and everyone is going to li… (ytr_UgwUj27S9…)
- Robot 1: tzztzzzttttttzzzz....Failure.. tzz... my system mmm.... Stewart shot th… (ytc_UgwdW8K6W…)
- I feel as if our destruction is inevitable… We are not smart enough to get along… (ytc_UgzWjHmca…)
Comment
In the short term, humans may be involved in trucking, but there is ZERO chance that in the long run, they will be. So this is not really an issue of if, but when. And since we know that, truckers have time to figure out what they are going to do. Were I a trucker, I would see what the cost would be to automate my truck, and then have it haul for me. Same thing with robot workers. The second they are humanoid and reliable, I will buy one and have it do jobs on my behalf. Change is inevitable, and if we have the chance to see where it is going, then it is on us to figure out alternative means of survival. Why? Because there is no guarantee of work. That's not part of the American deal.
Source: youtube · AI Jobs · 2025-05-28T20:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzNv5ndayJIZjzF_5F4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwTp54_47powBJYodh4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy4rPLN-dzmTZMsIdF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyARp-95P-15VPBIZ54AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzkHB_7bmPO7vdcgmx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx7glTXBkI9gt3VhOF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwm1PthyYVBN3I0CQJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxvspEYx1EGs2vmbPd4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyoasHandEeYxoxt8l4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx_4-C843GLS_e0_Sh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
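The raw response is a JSON array with one object per coded comment, so inspecting the model output for a given comment is a single parse-and-index step. A minimal sketch in Python, using two records copied verbatim from the response above (the field names `id`, `responsibility`, `reasoning`, `policy`, and `emotion` are taken directly from that JSON):

```python
import json

# Two records copied from the raw LLM response above; the full array
# parses the same way.
raw = """[
  {"id":"ytc_UgzNv5ndayJIZjzF_5F4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx7glTXBkI9gt3VhOF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]"""

records = json.loads(raw)

# Build an ID -> record index so any coded comment can be looked up directly.
by_id = {rec["id"]: rec for rec in records}

code = by_id["ytc_Ugx7glTXBkI9gt3VhOF4AaABAg"]
print(code["emotion"])  # indifference
```

The same index supports simple audits, e.g. counting how often each `emotion` label appears across a batch of responses.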