Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I dont see why people want robots to look like people. I would want it to look l…" (ytc_UgjPDiUeP…)
- "Sure the big corporations including his, who are pouring in billions right now o…" (ytc_Ugw66MFUF…)
- "Oh so China won't have to steal America's Intellectual Property anymore - they c…" (ytc_UgwbUfr1o…)
- "There's so many ai ads and bullshit around, this is at least something. I hope …" (ytc_Ugye-MJZB…)
- "In my opinion, LLM-based AIs can't reach AGI. They will always lack their own cr…" (ytc_Ugws0vRXy…)
- "I say, let the chips fall where they may Ai doesn’t share our prejudice towards …" (ytc_UgyRfSfio…)
- "Algorithmic bias is actually really tricky to deal with. It can be mathematicall…" (rdc_e7j7mps)
- "Yeah they eat from electric they will feel they might get jealous over fighting…" (ytc_UgxrDFHsH…)
Comment
Yeah, if AI at some point in time is able to bypass the humans altogether, even in just one field, AI-companies will surge much sooner than the human companies will be able to fire their human workers and restructure their processes. Fact is, AI right now is still a far cry from being able to turn the wheels of a complete business. So this all hinges on AGI or not. Right now, we're not. As soon as we are, nobody will need to replace humans, a parallel AI market will grow so fast it will simply put all human companies out of business really quickly.
youtube · Viral AI Reaction · 2025-11-24T03:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
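Each coded dimension above takes a value from a small closed set. A minimal validation sketch in Python, with value sets inferred only from the responses shown on this page (they may be incomplete), could check a coded row before accepting it:

```python
# Allowed values per dimension, inferred from the raw responses on this page.
# These sets are an assumption and may be incomplete.
ALLOWED = {
    "responsibility": {"company", "government", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "unclear"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "approval", "mixed"},
}

def invalid_fields(row: dict) -> list:
    """Return the names of coded dimensions whose value is not allowed."""
    return [k for k, allowed in ALLOWED.items() if row.get(k) not in allowed]

# The row matching the Coding Result table above:
row = {"responsibility": "company", "reasoning": "consequentialist",
       "policy": "none", "emotion": "indifference"}
print(invalid_fields(row))  # []
```

A row that is missing a dimension, or uses a value outside the inferred sets, would come back with those field names listed.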
Raw LLM Response
```json
[
{"id":"ytc_Ugz9EnbaYqsZw-JZc1V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwtrwB5tv2biVzD3Mh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyuRq1SL7dYUcaZL594AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzw5PW_uiDbnCGs4ph4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyLTKzzCSnILh-47aV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx-McQpqV6zR44gN2B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx1DqBlXTLqB4L79Tp4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwwuol9az0DQEZOaEp4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzhEYm8iPJHTbssIoB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxEy7qatnUpQFnPvOZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
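The raw response is a JSON array of per-comment codings, so looking up the coding for a given comment ID reduces to parsing the array and indexing by `id`. A minimal sketch (the variable names are illustrative, and the single-element array below is just an excerpt of the batch above):

```python
import json

# Excerpt of a raw batch response, as shown on this page.
raw_response = """[
{"id":"ytc_UgyuRq1SL7dYUcaZL594AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]"""

# Index the codings by comment ID for fast lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgyuRq1SL7dYUcaZL594AaABAg"]
print(coding["emotion"])  # indifference
```

In practice a model can return malformed JSON or IDs that were not in the batch, so a production version would wrap `json.loads` in error handling and cross-check the returned IDs against the submitted comment IDs.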