Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up by comment ID or by browsing the random samples below.
- "The moral thing to do is pimp out the AI with an affiliate link so your client c…" (ytr_Ugw_-opgy…)
- "AI has now become one of the most hated things on the internet, but because it h…" (ytc_Ugy-dIMZV…)
- "Tldr: Generative AI isn't Hollywood AI and I wish people would stop thinking it …" (ytc_Ugw6YTviQ…)
- "oh yes, if we’ve been AI entirely, then we won’t have Google or YouTube or any o…" (ytr_Ugxyl9cqb…)
- "I really love your nuance and honestly refreshing view of AI. Most artists nowad…" (ytc_Ugwc1bqN1…)
- "I'm the only one that his angry that the people that did nothing about private i…" (ytc_UgyfmaZJZ…)
- "2nd robot: oops / 1st robot: the box and you messed it up / Person: chill / 1st robot:…" (ytc_Ugx2ZkHut…)
- "for those who confuses you: ChatGPT and other Language Models don’t have emotion…" (ytc_UgxZz4NVI…)
Comment
Considering how the US government works, do you really think there will be any regulations?
Corporations have too much influence on our government. There are too many examples of companies putting dangerous products out on the market. Yes, eventually they're removed but usually after there are a few or more fatalities.
Companies want higher profit margins and paying people is a major cost.
They're going to want AI and robots as fast as the tech becomes usable.
Also, executives are not safe either. Low to middle-level managers will be at risk.
Execs higher up on the command structure will see them as an expense too.
Yeah, we can laugh at AI now but it will improve.
youtube · AI Jobs · 2023-06-12T12:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
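Every coded comment carries these same four dimensions. As a reference, here is a minimal Python sketch of the record type; the value sets below are inferred only from the codings visible on this page, so the full codebook may allow additional categories.

```python
from typing import Literal, TypedDict

# Value sets inferred from the samples shown on this page;
# the actual codebook may define more categories.
Responsibility = Literal["company", "ai_itself", "unclear"]
Reasoning = Literal["deontological", "consequentialist", "unclear"]
Policy = Literal["regulate", "none", "unclear"]
Emotion = Literal["fear", "indifference", "mixed", "outrage", "approval"]

class CodedComment(TypedDict):
    id: str                         # e.g. "ytc_UgxyrF7RygQqRNlpxaF4AaABAg"
    responsibility: Responsibility
    reasoning: Reasoning
    policy: Policy
    emotion: Emotion
```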
Raw LLM Response
[
{"id":"ytc_Ugw3mSi45sbuuEHocD54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwi42BPLfqbTR-_BdN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgymhTCDhUpV1Ituso54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxyrF7RygQqRNlpxaF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxQ4Oipx_thQ0zNUzN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwhj8ym5FOcaihTpEh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxAafv6RAvdU4t3JeN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgyVf65RV6W0gRkSK4N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy1r94E4os03uyYvQV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwUB7BMwrwPfMrCBb54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"}
]
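Because the raw response is a plain JSON array, the lookup-by-ID view can be reproduced in a few lines. A minimal sketch in Python, assuming `raw_response` holds the JSON text shown above:

```python
import json

def index_codings(raw: str) -> dict[str, dict]:
    """Parse a raw LLM response (a JSON array of coded comments)
    and index the records by comment ID."""
    return {rec["id"]: rec for rec in json.loads(raw)}

# raw_response is assumed to contain the JSON array printed above.
codings = index_codings(raw_response)
rec = codings["ytc_UgxyrF7RygQqRNlpxaF4AaABAg"]
print(rec["responsibility"], rec["policy"], rec["emotion"])
# -> company regulate outrage
```

The printed values match the Coding Result table above, since that comment's coding is the fourth entry in the raw response.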