Raw LLM Responses
Inspect the exact model output for any coded comment by looking it up with its comment ID.
Comment (youtube · "AI Moral Status" · 2017-02-23T13:5…)

> Why should conciousnes result in desire for rights and freedom? Any half decent ai toaster should love making toasts and never get tired of it. Im pretty sure you can program ai(that can feel) to feel good when they perform a task to a humans satisfaction, and feel really bad should they not, while getting switched of has 0 impact on their emotions. Building an ai to be like a human is a really stupid idea, because humans are "programmed" to be pretty selfish, and see work as a means to support their quality of life. An ai could be programmed to see work AS its quality of life, and see life without work as meaningless, and will want to be shutdown until it can work again.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugi9N6GWL6cC7ngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UghiCDQ5-AqcYngCoAEC","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"approval"},
{"id":"ytc_UghqesxgJCu2HngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugi49UirK0ZNlngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UghBFYgz1bIil3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgipLAfLyJQ7FXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgjcqmMQdYOWJXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgjTm-RYCS7jdngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UggkmMM8P9RXzHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgiL3f3OtGXlw3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```