Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
The end game to AI replacing SWEs isn't just that SWEs have no job, it's also that any purely software companies employing them go out of business because they can't compete with AI companies.
I think this is why these mega software corps are investing in it. It's to hedge their risk. What happens to Google if Microsoft develops AI that can make software end-to-end but Google doesn't? Google can't compete for the cost or at the scale Microsoft can. And they probably go out of business.
youtube
2025-03-13T00:0…
♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwixWDBk_2c9OSGEpV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzMYM1OHN3oZf1dhSl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgznTmolZ_Ap5N4r36l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxhuD_XklrMs8KPAmd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwqCyJzufCy1bXkvm94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx9uHlBPLO8vJw-O3N4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwGc-IcFnltI58jsqp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz6BnCVNH-clBQ4OaB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzGMgmvCxI2Lf8DHJp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxvKokdm3mDnqF61aF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
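A response like the one above is only usable downstream if every record carries an `id` and a valid value for each coding dimension. The following is a minimal validation sketch, assuming the category sets visible in this sample (the real codebook may define additional values, e.g. more `policy` categories than the `none` seen here):

```python
import json

# Allowed values per dimension, as observed in the sample response above.
# This is an assumption inferred from the data, not the authoritative codebook.
SCHEMA = {
    "responsibility": {"none", "developer", "user", "company", "ai_itself"},
    "reasoning": {"mixed", "consequentialist", "deontological"},
    "policy": {"none"},  # only "none" appears in this sample
    "emotion": {"indifference", "approval", "mixed", "fear",
                "outrage", "resignation"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against SCHEMA."""
    records = json.loads(raw)
    for rec in records:
        if not rec.get("id"):
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: invalid {dim}={rec.get(dim)!r}")
    return records

# Hypothetical single-record response for illustration.
raw = ('[{"id":"ytc_x","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
codes = validate_codes(raw)
print(len(codes))  # → 1
```

Rejecting the whole batch on any invalid value (rather than silently coercing it) makes LLM drift visible early, at the cost of re-prompting for the occasional malformed response.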