Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
People picking products WILL eventually be replaced by robots? Why? Because people are a liability. They require vastly diverse resoirces to keep them going. With robots, all the money spent on people for things like 401k matches, subsidised health care, workmans comp, and unemployment contributions go away. Instead, you have a workforce built of technicians who maintain and repair the robots.
As an employer, you dont have to worry aboit all those warm bodies who have personal problems come up. All you do is have backup robots ready to go when one robot goes down. This means maximum uptime and predictable performance out of your picker workforce.
youtube · AI Harm Incident · 2024-09-17T16:2… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
  {"id":"ytc_UgytuZszsAi8h2fJBfN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyRuEu7us1IRA7PdzB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyGl-drU_13J5lqBwt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzL-4-LuR-2RQw74wN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugzeq9qOXlw6ewCh-b94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxUS1V1ljLWUa9v_vR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyfenfDyMnLylFmfuZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwLJIZMx2aPhPeGpqJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx7wSpxoQmk37kLH554AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx1v1GP2xvROdQOBLl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
```
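A downstream consumer of these raw responses typically parses the JSON array and checks each row against the codebook before accepting it. A minimal sketch, assuming the allowed category values are exactly those observed in the response above (the real codebook may include more):

```python
import json

# Category values observed in the raw LLM response above.
# Assumption: the full codebook may define additional values.
ALLOWED = {
    "responsibility": {"company", "none", "ai_itself", "user"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"outrage", "indifference", "approval", "mixed", "resignation"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw model response; keep only rows with an id and valid codes."""
    valid = []
    for row in json.loads(raw):
        if not isinstance(row, dict) or "id" not in row:
            continue  # drop malformed rows rather than failing the batch
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_x","responsibility":"company","reasoning":"consequentialist",'
       '"policy":"none","emotion":"resignation"}]')
print(len(validate_codes(raw)))  # prints 1
```

Dropping invalid rows (instead of raising) keeps one bad coding from discarding the other nine comments in the batch; rejected IDs can then be re-queued for a retry pass.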