Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
AI poses a potential risk if it is not regulated. There must be rules and regul…
ytc_UgyahBhPZ…
The last time unemployment got to 30% the economy crashed - the 1930s. Once tha…
ytc_Ugzn3yK3y…
Who will buy AI if they are fired and bankrupt? A company’s employee is other c…
ytc_UgwTd50z2…
Ai bros crying in the conner for not allowing taking other people's work that ma…
ytc_UgzCumOyZ…
I love capitalism and profits. I've owned businesses, made ---a decent net worth…
ytc_UgzyE-NId…
The other robot: *messes the box*
The robot: ARE U KIDDING ME IM QUITTING BRO *…
ytc_UgyxgCofp…
Robot 2:oops
Robot:Bro
*very heavy slam*
Worker: Calm Dawn!
Robot:Shut the f…
ytc_Ugxw_sQ03…
The actual problem is that people are just allowing some database to do ALL of t…
ytc_UgzYc927d…
Comment
What Neil fails to grasp is that the world economy depends upon wage and salary labor in the form of jobs. Most jobs are far more routine than we care to admit.
As such, these jobs that are currently occupied by humans are quite prone to being turned into algorithms and transferred to computers.
Yes, clearly, there are certain tasks that require high levels of creativity that, for now, only humans possess. However, they are a small minority of jobs. Most people will not be able to become creatives to earn their keep.
The few geniuses employed will cease to be employed if the unemployed masses do not have the income to support them.
AI/AGI will be more like a tsunami that wipes out economies, not a gradual, rising tide that we can adjust to.
youtube
AI Moral Status
2025-12-03T22:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyShgWmglNZOUOh3ZJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw22Ek4Ua39E3RglPZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwIs7jZORhb2TKHN3R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzJFLlJCe5K__dQMSN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwTsuPMp-imLzChufB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzIJ4DxPHrZzGsSpWJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy0qyBkgJ7-LSZm5Wt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxkYZItVoUs9tdPWu94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugwca8gSTD_0VyTmzUt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxoTXqAUctK9ZizTc94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
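The raw response above is a JSON array with one object per comment, each carrying the four coding dimensions. A minimal sketch of how such a payload could be parsed and validated before storing the codes (the allowed values below are inferred only from the examples on this page; the real codebook may define more categories, and the helper names are hypothetical):

```python
import json

# Allowed values per dimension, inferred from the coded rows shown above.
# Assumption: the actual codebook may include additional categories.
ALLOWED = {
    "responsibility": {"none", "company", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"indifference", "fear", "resignation", "mixed", "outrage", "approval"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, skipping invalid rows."""
    coded = {}
    for row in json.loads(raw):
        comment_id = row.get("id", "")
        codes = {dim: row.get(dim) for dim in ALLOWED}
        # Keep a row only if the ID looks like a YouTube comment ID and
        # every dimension holds a known value.
        if comment_id.startswith("ytc_") and all(
            codes[dim] in values for dim, values in ALLOWED.items()
        ):
            coded[comment_id] = codes
    return coded

raw = (
    '[{"id":"ytc_UgyShgWmglNZOUOh3ZJ4AaABAg","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]'
)
print(parse_codes(raw)["ytc_UgyShgWmglNZOUOh3ZJ4AaABAg"]["emotion"])  # indifference
```

Dropping rows with unknown values rather than raising keeps one malformed object from discarding the whole batch, which matters when a single LLM response codes many comments at once.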