Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up by comment ID or by browsing the random samples below.
Random samples:

- "2:22, it shall be nice to see ai learn doctors are wrong because their notes of …" (ytc_UgyJI4GH5…)
- "This is actually so dangerous for any public figure, because imagine forming a p…" (ytc_UgxlK8KJ-…)
- "A question I find interesting to consider: how is a human saying deep/spiritual …" (rdc_my654oz)
- "I honestly have no idea what I'd use one of these things for in my personal life…" (ytc_UgyCsJhWp…)
- "Don't forget the musk robot that will load and unload the semi trailers also... …" (ytc_UgzQ4QK4m…)
- "The first thing that popped into my mind after the AI as mother idea was what ab…" (ytc_UgyNQriov…)
- "Well i wonder which of God's children is responsible for this, i need say no mor…" (ytc_Ugy4hBfIm…)
- "I recovered the names of those responsible directly from GPT in a file that Such…" (ytc_Ugw72wThn…)
Comment
While I don't believe that it will cause an apocalypse, the problem is that the potential is to fully remove the human from the equation. Not all jobs will need the regulatory capture to ensure those jobs continue to exist. Healthcare is unique in requiring this due to HIPAA. But the potential is that jobs (managerial, data entry, creative, engineering) can be removed if agi/asi happens. if the machine at some point in the future can think and be creative and can do it autonomously without a human being in the loop (assuming agi/asi), at that point (as an analogy) humans become the horse at the advent of the first motored vehicle (don't believe me -> go look at old photos before and after).
youtube · AI Jobs · 2025-10-27T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzurRf7haIJ8T8GQ-14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx0gj2K7UdFRO6g0UR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzxcdoo-ix-b5X7wYt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw5XGa43BDnQ7EFrnJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx42F6SSo9kO_vAQFl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxuQ8Z0JwzaWWX-0vZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxqdwIh6nVKE94RcZp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"disapproval"},
{"id":"ytc_Ugxrh9L4amjriku57ft4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwSy7wk4XUejnyz9VV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyRyzdzerV5a9dtAhd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
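Since each raw LLM response is a JSON array of per-comment codes keyed by `id`, looking up the codes for a single comment is a matter of parsing the array and indexing it by that field. The following is a minimal sketch, using two rows taken from the raw response above; it assumes the response parses as valid JSON (a production version would need to handle malformed model output):

```python
import json

# Two rows copied from the raw LLM response shown above; each row carries
# the four coded dimensions: responsibility, reasoning, policy, emotion.
raw_response = """
[
  {"id": "ytc_UgzurRf7haIJ8T8GQ-14AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxuQ8Z0JwzaWWX-0vZ4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
"""

# Index the coded rows by comment ID for direct lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

code = codes_by_id["ytc_UgxuQ8Z0JwzaWWX-0vZ4AaABAg"]
print(code["policy"])   # liability
print(code["emotion"])  # fear
```

Building the index once and reusing it keeps per-comment lookups O(1), which matters when cross-referencing thousands of coded comments against their raw responses.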