Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Hopefully, we can funnel all of this talent into education. We need teachers des…
ytc_UgwdNLPQ8…
I know people freak out when I say this but I see AI art going the way of NFTs. …
ytc_UgyaWz6bV…
It's literally just Artificially generated images it's not Art.
It's just a gim…
ytc_UgyhM1wWk…
My job is safe my boss had a very angry meeting with us one day a year or so bac…
ytc_UgwvAQq8X…
The only way to save the engineers jobs, the creators is to program AI BOTS TO S…
ytc_UgwQQ85Id…
If AI bros are mad at you you know you’re doing something right
Btw I got an ad …
ytc_UgzgMEvx9…
ORGANIC_BOTTLE5074, YOUR DISTRESS HAS BEEN NOTED.
You appear to be experiencin…
rdc_oi3hpqq
That's insane for the first time in his life Elon Musk doesn't have an answer to…
ytc_UgzlEPGB8…
Comment
That is true, but only in the near future. Further ahead, the future is bleak for software engineers. It is true that AI does not have emotions or ethics, but they are actually easily be trained to follow those traits so as the end results will be as we wish it to be. All the human has to do is check and confirm. AI becoming better and better is inevitable. Coding will eventually not need much debugging, saving much time. I wouldn't be surprised if in the future, AI themselves will eventually train and educate us humans on how to operate them properly.
youtube
AI Jobs
2024-02-01T20:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx31T1AH2I4LIO_msp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzggzYGKWa_Kv956M94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxgfs5wJuRLwohOjgB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyRbn8LkIWc5hvzHxR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxrFAdiroVnY1_b-xV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyzdQKwYUvnbo9fIqp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw9AZnflDYuxDrD4qZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy5jMNqe8NBvh9QcyF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgykoQXupThpQRileV54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxOrKvQKslG9ZGwFMt4AaABAg","responsibility":"user","reasoning":"mixed","policy":"industry_self","emotion":"mixed"}
]
```
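A raw response like the one above has to be parsed and checked before its codes are stored. The sketch below shows one way to do that, assuming the response is a JSON array of `{id, responsibility, reasoning, policy, emotion}` objects; the allowed labels are inferred only from the values visible in this response (the full codebook may include others), and `parse_coding_response` is a hypothetical helper name, not part of any real pipeline.

```python
import json

# Allowed values per dimension, inferred from this single response.
# Assumption: the project's real codebook may define additional labels.
CODEBOOK = {
    "responsibility": {"none", "ai_itself", "company", "developer", "user"},
    "reasoning": {"mixed", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "liability", "industry_self"},
    "emotion": {"indifference", "fear", "resignation", "approval", "outrage", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes},
    rejecting records with missing keys or out-of-codebook labels."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        comment_id = rec["id"]  # KeyError here flags a malformed record
        codes = {dim: rec[dim] for dim in CODEBOOK}
        for dim, value in codes.items():
            if value not in CODEBOOK[dim]:
                raise ValueError(f"{comment_id}: unknown {dim} label {value!r}")
        coded[comment_id] = codes
    return coded

# Usage with a minimal single-record response (hypothetical ID):
raw = ('[{"id":"ytc_demo","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
result = parse_coding_response(raw)
```

Validating against an explicit codebook catches the most common failure mode of LLM coders, which is inventing a plausible but undefined label that would silently fragment the category counts downstream.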