Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (click an entry to inspect):

- "is just the beginning. all the difficult task like accounting and payroll can a…" (ytc_Ugy6DAxdv…)
- "You say that copyrighted works are in datasets (models) but they're not. They're…" (ytc_UgydOwE0u…)
- "Logic, numbers and elementary math in general have never been lawyers' forte. Fo…" (ytc_UgzTCqveF…)
- "1. I do not believe AI will take over. I believe evil people using AI are taking…" (ytc_UgyNikipa…)
- "AI has progressed exponentially now, no comparison to earlier bots. Talk with Ch…" (ytr_UgyNnQ6DM…)
- "Its good to know the San Francisco Police Department has not yet been subverted …" (ytc_UgxTELPiX…)
- "The 1% no longer need the 99%. In a few years or decades, only billionaires and …" (ytc_UgyIkodX_…)
- "Oh nooo digital drawinggg thats sucha way to cheaaaat Seriously, I can’t draw wi…" (ytc_UgzWLdAoH…)
Comment
as a student in software engineering I wish I could mention to the scientific community as a whole "stop researching AI unless you are prepared to nurture that AI". From the dawn of human existence we have made tools to make our lives easier, or worn out our tools so we don't wear out our bodies. Making machines to do our work for us is so we don't wear out our bodies, making those machines sentient means they won't want to wear out their bodies either. So we should just plain old not make AI so we can keep the machines as tools we use to better ourselves, rather than turning machines into our alien offspring.
youtube · AI Moral Status · 2017-02-27T04:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugg3xHoUtx6gWngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UggE8ZCLy_Y7-XgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UghT7BJ_Jkv_q3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UghD0PHvZSddz3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UghzdlAYEf702XgCoAEC","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugi0wlK0xxTZ3XgCoAEC","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UggL_n6lQWteeXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UggLBNtGHpEtHHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgiX-KqMqEVNV3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgivRlFZ-T5UaXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
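The raw response is a JSON array of per-comment codings keyed by `id`, with one value for each of the four dimensions (responsibility, reasoning, policy, emotion). As an illustration of the "look up by comment ID" step, here is a minimal sketch; the helper name `lookup_coding` and the trimmed one-row payload are assumptions for the example, not part of the tool itself:

```python
import json

# Assumed payload: a single row in the same shape as the raw LLM response above.
raw_response = """
[
  {"id": "ytc_Ugi0wlK0xxTZ3XgCoAEC", "responsibility": "developer",
   "reasoning": "virtue", "policy": "regulate", "emotion": "mixed"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw LLM batch response and return the coding for one comment ID,
    or None if that ID is absent."""
    rows = json.loads(raw)
    return next((row for row in rows if row["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_Ugi0wlK0xxTZ3XgCoAEC")
print(coding["policy"])  # → regulate
```

A lookup like this is also a cheap validity check: if the model dropped or renamed an ID in its batch output, the function returns `None` instead of silently mis-assigning a coding.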