Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- ytc_UgycJHmsB…: "Neil deDrasse Tyson said that based on timelines of evolution of technology, a …"
- ytc_UgzX1JKCm…: "If you need AI to tell that mf was shoplifting you should’ve been held back at 4…"
- ytc_UgxEB8NQt…: "We laugh and applaud today . One day in the near future, humanity will be decima…"
- ytr_UgxDk5iqr…: "You should be more concerned with what humans are going to do to ourselves and t…"
- ytc_Ugxtu-Tr1…: "I work in the gas industry in the UK. The government just approved our technolog…"
- ytc_UgzEzrkVZ…: "AI art isn't going away though and I'm in favor of it. If someone wants to gener…"
- ytc_UgwlBefZC…: "YES A MILLION MORE EPISODES OF HUMILIATING AI IN WAYS THAT WOULD SEND YOU TO NUR…"
- ytc_Ugz-91L8r…: "You just don’t want us t use AI because yin will be out of a job…"
Comment
Very interesting views. Though I agree that some jobs will disappear in certain countries and cities, I don't see this happening in third world countries. As an example, to replace 10 or 20 labourers on a building site (who gets paid minimum or basic wages) with 1 or 2 robots that costs millions each would not make business sense. So this is really context-influenced.
Another current example, robot vacumers are still not commonplace in many countries. They are getting cheaper and cheaper, but still not commonplace. Perhaps the timelines are more extended than currently predicted?
youtube · AI Governance · 2025-09-15T15:3… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyxHi-gnFLnlhSX_9N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyVLAf7VVXQIus73Dd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz4ZFKeh5HkO6TsJO54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz1TUdh5L5zuW_6hAJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxZGF0m8k3lAqOT8WF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwL_4wIDfCElEWoPJ54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwNZUDj5BFzj1rWA9N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzYKdXmOuAIAtgpJSd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyZBKaVqBgXXeTPeUp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzLzVVp-pUBhex3KgF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
```
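A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a hypothetical validator, not part of the tool: the allowed values for each dimension are inferred only from the codes visible in this sample and may be incomplete.

```python
import json

# Allowed codes per dimension, inferred from the sample output above.
# ASSUMPTION: the real codebook likely has more values than shown here.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "user", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none"},  # only "none" appears in this sample
    "emotion": {"indifference", "fear", "resignation", "approval", "outrage"},
}

def validate_coding(raw: str) -> list:
    """Parse a raw LLM response and check every record against the codebook."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in the sample are prefixed "ytc_" (YouTube comment).
        if not str(rec.get("id", "")).startswith("ytc_"):
            raise ValueError("unexpected comment id: %r" % rec.get("id"))
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError("%s: bad %s=%r" % (rec["id"], dim, rec.get(dim)))
    return records
```

Validating before storage catches the common failure modes of LLM coders: malformed JSON, invented category labels, and IDs that do not match any sampled comment.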