Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by its comment ID.
Comment

> wiping out jobs with robots and ai might be ok assuming we re write our social contracts to take care of people. if we dont, theyll be a lot of poor people with few ways( if any) to make income and that system will collapse anyway.
>
> we can accelerate the issue by ensuring fewer people suffer over a long period of steady job loss by allowing this to happen fast, putting pressure on institutions to make the neccessary changes for this wonderful revolution where people dont have to die in mines or work 80 hours a week to have a life. we can just chill. or at least i hope. maybe its idealistic.

Platform: youtube · Topic: AI Jobs · Posted: 2025-10-08T17:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_Ugx9t4mV6cLvkcZ4cDh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},{"id":"ytc_UgynFDjvLhj_ddJ5XoN4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},{"id":"ytc_Ugx0GTzkIftYxjbCB8B4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},{"id":"ytc_UgxeZ_FRl0CV0ldgh014AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytc_UgxsKpFdEsn9ad8W3hh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},{"id":"ytc_UgxwM9dneSp0jBS8V5B4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},{"id":"ytc_UgwaBo7SBenViRU0Vmp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},{"id":"ytc_UgzgSNW-TiRuJN_oNxl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},{"id":"ytc_UgxFEBmDm8TdyvBqzPl4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"indifference"},{"id":"ytc_UgzUktp78uS9wpgv69N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}]
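The raw response above is a JSON array with one record per comment, each carrying the four coding dimensions. A minimal sketch of how such a batch could be parsed and validated, assuming a codebook restricted to the category values that actually appear in this output (the real codebook may define more categories, and `validate_batch` is a hypothetical helper, not part of this tool):

```python
import json

# Allowed codes per dimension, inferred from the values seen in this
# tool's output; assumption: the real codebook may allow more values.
CODEBOOK = {
    "responsibility": {"company", "developer", "government", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "ban", "none"},
    "emotion": {"fear", "outrage", "resignation", "approval", "indifference"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # skip malformed entries entirely
        # every dimension must be present and hold an allowed code
        if all(rec.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(rec)
    return valid

# First record from the raw response shown above
raw = ('[{"id":"ytc_Ugx9t4mV6cLvkcZ4cDh4AaABAg",'
      '"responsibility":"company","reasoning":"consequentialist",'
      '"policy":"liability","emotion":"outrage"}]')
print(validate_batch(raw))  # the single record passes validation
```

Dropping malformed records rather than repairing them keeps the coded dataset conservative; rejected IDs can be re-queued for a retry pass.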