Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Human beings are rendered helpless, and they don't understand how they are being…" (ytc_UgzMrWaAE…)
- "Self driving cars will be very helpful, and will do wonders for peoples drivi…" (ytc_Ugz9eHjSF…)
- "honestly, ai looks ugly. The style feels so weird, and not in the stylised carto…" (ytc_UgwivVb6e…)
- "Ai will be the death of the internet we've known. Nothing will be genuine, and i…" (ytc_Ugxj85T0B…)
- "Me and my friends do sometimes mess around with AI like generating images that c…" (ytc_Ugy8bipET…)
- "A.i is a new species and should be treated as such.And if Humans keep barking or…" (ytc_Ugzw4vMyB…)
- "At very least any AI art should have disclaimer on whos work they have used on t…" (ytr_Ugzs2KGMT…)
- "I myself use AI images because I am a web novel author who can't draw at all and…" (ytc_UgxcQVROz…)
Comment
Here's my two cents. Achieving AGI is like mass reaching the speed of light, you'll keep adding nines for infinity until you figure out how to make a warp drive.
As close as we get to AI broadly replacing humans across the workforce, until we have AGI then some aspects of every market will always require humans.
If anything, the demand for technical professionals will skyrocket as AI continues to improve; I think this is extremely good for society because more people will have the opportunity to specialize for a high paying job.
Meanwhile, AGI would either exterminate homo sapiens sapiens, functionally make our species go extinct by evolving us into something like 'homo deus' using nano-MEMS and/or keep some/us as specimens in a "zoo."
youtube · AI Jobs · 2025-10-08T03:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxqhHhvLIrij4QjC7x4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxVmmTkPXui6HZNYpB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz_UMewRrLLCYhXBWV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy3WGauMU8jShADdad4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyeaFed62-uZhaVqbJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugxun4k30lkDtHQB8n14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy7RF5T8s_iDk8oZV54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz3ZjgnSvlvKnqT2DF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyD1Ae4TrJ0tyTFlih4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz-DaSCpEipLa9v5EN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
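The per-comment coding table above is produced by matching a comment's full ID against the `id` field in this array. A minimal sketch of that lookup (the `RAW_RESPONSE` string below quotes two entries from the response verbatim; the `index_by_id` helper is illustrative, not the tool's actual code):

```python
import json

# Two entries copied verbatim from the raw LLM response above.
RAW_RESPONSE = """
[
 {"id":"ytc_UgxqhHhvLIrij4QjC7x4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugz-DaSCpEipLa9v5EN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse the model's JSON array and index each coding row by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_by_id(RAW_RESPONSE)
row = codings["ytc_Ugz-DaSCpEipLa9v5EN4AaABAg"]
print(row["reasoning"], row["emotion"])  # mixed indifference
```

Indexing once into a dict makes each "look up by comment ID" an O(1) operation instead of a linear scan over the batch.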