Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- *Rule #1 - Do not invent any machines more intelligent than humans are! EVEN … (ytc_UgzwNGuVb…)
- Humanity just has this habit of taking a beloved hobby and making profit off of … (ytc_UgzRNMdLV…)
- Thing is people do not understand that AI will need another decade or two to act… (ytc_UgwcIe230…)
- Ai art is good to have fun but i don't think it can be considered real art and p… (ytc_UgzEu0gk9…)
- well ai vids like this should not be allowed as it could influence people to do … (ytc_UgzXh0apl…)
- well wait. if we are made in Gods image, and ai is made in our image, then would… (ytc_UgxWelTOB…)
- This is a panel of people who clearly know little to nothing about A.I.. To say … (ytc_UgyBFHP_-…)
- for no reason? sora ai is one of the best advancements that we have made if it t… (ytr_UgztUqcJB…)
Comment
Companies that displace a job with AI should be required to pay out that salary to the employee for time, and for every job replaced by AI they should be required to contribute to funding basic income indefinitely. If I work in a call center and get replaced. I should still get that income for the next 5 years to help me transition into something better. If I then transition into IT repair or manufacturing and get replaced by robots, I should be given the same deferred salary to help retool.
In 100 years time there could still be a crisis but by then humans may have found a way to live simple fulfilling lives off basic universal income.
youtube
AI Governance
2025-06-18T15:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwCUz5SWmui9Nyblm54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwTxNMj5AtkyR0EmAx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgxveLHQgZMMyYIT33h4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzNgMkZa5iJH9WcdRJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugybnxrhd6sZsJYF8xN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwmH0oLfRntngwD8ch4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxVNn4DaKBToIB98sp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugzf5g64r-rP-9Q3h0x4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwK9PQERP5buzMhmAp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgydFORy-ca_LZDJuGN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
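The raw response is a JSON array with one record per comment, using the five fields shown in the coding-result table (id, responsibility, reasoning, policy, emotion). A minimal sketch of the "look up by comment ID" step, parsing that array and indexing it by id; the record data is copied verbatim from the response above, but the lookup code itself is an illustration, not the tool's actual implementation:

```python
import json

# Raw LLM response, verbatim from the coding run shown above
raw = """[
{"id":"ytc_UgwCUz5SWmui9Nyblm54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwTxNMj5AtkyR0EmAx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgxveLHQgZMMyYIT33h4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzNgMkZa5iJH9WcdRJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugybnxrhd6sZsJYF8xN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwmH0oLfRntngwD8ch4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxVNn4DaKBToIB98sp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzf5g64r-rP-9Q3h0x4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwK9PQERP5buzMhmAp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgydFORy-ca_LZDJuGN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]"""

records = json.loads(raw)

# Index by comment ID so a single coding can be pulled up directly
by_id = {record["id"]: record for record in records}

# The ID below is the second record in the response above
coding = by_id["ytc_UgwTxNMj5AtkyR0EmAx4AaABAg"]
print(coding["policy"], coding["emotion"])  # liability approval
```

This matches the coding result shown for the example comment: policy "liability", emotion "approval".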