Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "I say we already been this advanced b4 .. and its only matter of time b4 that re…" — ytc_UgytNpMN6…
- "Several regulation should be implemented. Things like: No single AI should have…" — ytc_UgwH3io5Z…
- "In other news... - Britons get exactly what they asked for (majority). - Britons…" — rdc_fwi1fgs
- "I saw one of those driverless trucks on a cross country highway on a road trip o…" — ytc_Ugy6BrXKV…
- "In Robot mind there is ne sensitivity for that robot have no recoil control. Fi…" — ytc_Ugz-57I_x…
- "Every artist should opt out/strike imo. IE not upload publically and we'll see h…" — ytc_UgwOqfwiZ…
- "@bleachedout805 I look forward to the new $20 per image model AI companies want …" — ytr_Ugzj9eBe7…
- "there's a scifi short novel from 40ish years ago about autonomous killer drones …" — ytc_Ugx_fUG2G…
Comment

> lol I’m not surprised. They already eliminated test engineers and pushed it off to the buyer as “early access”. Since ai is based on all code read in, not just good code, of course it’s going to be crappy. lol! I’m sorry your billions aren’t enough big companies. Maybe grow some empathy and actually support your employees.

Source: youtube · AI Jobs · 2026-02-05T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgytiD7CayJjti5FPvd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx9PjlnoXNryw1JGQp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgysDPHBq0b4ktpQc7d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugyka_gGKWP2PJA0CJd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzyfmVeShwNijbg8zF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwjuR66Ap5q4svaWnp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzU6pufVsymIeVsa7d4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugwjanbj0dwFEsg1OdZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzFIpPjmxb4-Te9fQ94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyWw4VSHwe0CwA8-il4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
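The raw response is a JSON array with one coding record per comment. A minimal Python sketch of how such a batch could be parsed and validated before storage — the `SCHEMA` value sets here are hypothetical, inferred only from the records shown above, and the real codebook may define more categories:

```python
import json

# Allowed values per coding dimension (hypothetical; inferred from the
# sample output above, not from the project's actual codebook).
SCHEMA = {
    "responsibility": {"company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"outrage", "approval", "fear", "resignation", "indifference"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # skip entries missing the comment ID
        # Keep the record only if every dimension has an allowed value.
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid
```

Validating against a fixed value set catches the common failure mode where the model invents an off-schema label; such records can be dropped or queued for re-coding rather than silently written to the results table.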