Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I don’t know why everyone keeps letting these start up geeks sell these types of ideas. First self check outs, then robots making pizzas, robots in factories, A.I., it will not stop until these a holes take all of our jobs. They are directly coming for our livelihoods, and trying to destroy everyone’s way of life.
Don’t fall for the idea that they are so smart and innovative, they just know who has the money and they know how to smooth talk their way into getting it. Tech companies are constantly going out of business for stupid ideas, but they always take down a lot of workers and walk away with a lot of money Fn bundles of sticks!
Source: youtube | Topic: AI Jobs | Posted: 2025-05-29T11:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgybdNGFMHBSHimm81R4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugy3XlVjwZArMx7Ki814AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_UgwwJMUlZAsRPUuCwol4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugy5_GxDZxzHJ--rZ2J4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwzFBefKdNjM8j4zJJ4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzt17qkkHg3dvma4XB4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxVS9FCaJj8MOlr3zd4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxKuVXI_aCCiGrK6454AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugy3BLIkZLpdpQOjiN54AaABAg", "responsibility": "user", "reasoning": "contractualist", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_Ugz6chsffWG1St6dWVB4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
```
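A minimal sketch of how a batch response like the one above can be parsed and validated before the per-comment coding result is displayed. The allowed values per dimension below are only those observed in this particular response; the full codebook may define more, and the `parse_batch` helper name and sample ID `ytc_abc` are illustrative, not part of the real pipeline.

```python
import json

# Values observed in this batch; the actual codebook may define more (assumption).
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "mixed"},
    "policy": {"ban", "regulate", "liability", "none"},
    "emotion": {"outrage", "resignation", "fear", "indifference", "approval", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw batch response and index records by comment ID,
    rejecting any record with a missing ID or an out-of-codebook value."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not isinstance(cid, str) or not cid:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={rec.get(dim)!r}")
        # Keep only the coding dimensions, keyed by comment ID.
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Hypothetical single-record batch for illustration.
raw = ('[{"id":"ytc_abc","responsibility":"developer",'
       '"reasoning":"deontological","policy":"ban","emotion":"outrage"}]')
print(parse_batch(raw)["ytc_abc"]["policy"])  # ban
```

Indexing by ID is what makes the "look up by comment ID" view cheap: once the batch is parsed into a dict, rendering the Coding Result table for any comment is a single lookup.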