## Raw LLM Responses

Inspect the exact model output for any coded comment, either by looking up a comment ID or by picking one of the random samples below.

### Random samples
- "_Elon should never even implemented Auto Pilot_ Tesla just put its own name to …" (ytr_UgzkOvDpc…)
- "One right wing extremist known for using the ai excuse is called out by another …" (rdc_nc8d958)
- "And not a gottdamm one of em doing any work...just sitting and drawing a check..…" (ytc_Ugyp3en7S…)
- "It already shows it not smarter than humans. The interviewer asks “what does Sop…" (ytc_Ugxy63JxL…)
- "Make companies fire all SWEs and sell your product, now companies are reliant on…" (rdc_m6xssbp)
- "Well.. forget about ghosts.. Now this robot and it's evil expressions scaring th…" (ytc_UgwtP29db…)
- "I remember when they told us AI wouldn't replace our jobs. Funny how everyone e…" (ytc_Ugwo5py9q…)
- "The only things I've found AI actually helpful for, in my day to day life, are m…" (ytc_UgxN3-eJS…)
### Comment

> General intelligence isn't the problem, its the system around it. If our economy shifted then who cares if AI does all the jobs, we'd be free to do thing we actually want. The issue is just with making sure everyone benefits.
>
> And no, general intelligence doesn't automatically lead to a super intelligence AI takeover. Thats like saying humans can learn therefore they're all going to turn into mass murderers.

youtube · Viral AI Reaction · 2025-12-03T15:4…
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
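The coding result above is the per-comment record the "look up by comment ID" view resolves. A minimal sketch of that lookup, assuming coded records are held in memory as dicts keyed by comment ID (the storage layer here is hypothetical, only the field names come from the table above):

```python
from datetime import datetime

# One coded record, mirroring the Coding Result table above.
coded = [
    {
        "id": "ytc_Ugz1fWZEz3IsEkqQ_sJ4AaABAg",
        "responsibility": "distributed",
        "reasoning": "consequentialist",
        "policy": "none",
        "emotion": "indifference",
        "coded_at": datetime.fromisoformat("2026-04-26T23:09:12.988011"),
    },
]

# Index by comment ID so a lookup is a single dict access.
index = {rec["id"]: rec for rec in coded}

def lookup(comment_id):
    """Return the coded record for a comment ID, or None if it was never coded."""
    return index.get(comment_id)
```

Returning `None` rather than raising keeps the UI path simple: an unknown ID renders an empty state instead of an error page.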
### Raw LLM Response

```json
[
{"id":"ytc_Ugw-ePWEQj3S2gPJSsF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwmpDy0qTcXVwACsrF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz1fWZEz3IsEkqQ_sJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzJG6sqQKhPn8vTrId4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwwW6QF0PHaZt6SzYF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyCTc-_wNcMUhiYWL54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxaA4CiGVMNeORtpNN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwPc1GFS-kQuGamHR14AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwdmC0bpjKragU7uR54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxAAJhx48AsRTFRoEB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
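A raw response like the one above has to be parsed and validated before it enters the coded dataset, since the model can emit malformed JSON or invented category values. A minimal validation sketch, assuming the codebook contains exactly the category values seen in this sample (the real codebook may define more):

```python
import json

# Allowed values per dimension, inferred from the sample response above;
# assumption: the actual codebook may include additional categories.
CODEBOOK = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"outrage", "approval", "indifference", "fear", "resignation"},
}

def parse_llm_response(raw):
    """Parse a raw LLM coding response and validate every record.

    Raises ValueError on malformed JSON, a missing 'id', or a dimension
    value outside the codebook, so a bad batch fails loudly instead of
    silently polluting the coded dataset.
    """
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded comments")
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing 'id': {rec!r}")
        for dim, allowed in CODEBOOK.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {value!r}")
    return records
```

Rejecting the whole batch on any bad record is a deliberate choice: partial acceptance would make the per-comment lookup silently inconsistent with the raw response stored alongside it.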