Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Look, Cooywright IS IP Law & bypassing IP restricti9ns of the US (overbroad as t…" (ytc_UgyjDfwZW…)
- "As someone who role was supposed to be replaced by AI back in 2012, this is come…" (ytc_UgytgIk5M…)
- "The host is talking about AI taking jobs and he talks about having a self drivin…" (ytc_Ugzda9Hyz…)
- "I’m tired of discovering ai through the algorithms sent to my feed there’s very …" (ytc_UgytMnvOl…)
- "The most underappreciated moment in this conversation is when Altman describes a…" (ytc_UgyFfYgJT…)
- "Technically speaking ai can't be racist it's literally an example of correlation…" (ytc_UgwSCQWFR…)
- "That's easy they are robots because because it says AI robots didn't you write a…" (ytc_Ugz7dZAvZ…)
- "God damn, Bernie. I love you so much. This needs to be scrutinized now more than…" (ytc_UgydFFyJH…)
Comment
Funny thing is yes, if they are thinking feeling beings they deserve rights. On the other hand if you have to treat them equal to humans it pretty much removes the robot's purpose of cheap, consistent, reliable labor. If a robot gets breaks, wages, has the ability to do sub par work or slack off like a normal human, why spend large amounts of money to build one when making humans is so much easier? There becomes a point where if a robot becomes too human, there is no real point in being a robot any more.
Source: youtube · AI Moral Status · 2017-02-24T10:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugh0c4l23P6EYHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgizmdfK6BHeengCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgiS9-lmbu6FW3gCoAEC","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Uggg7_XeDnLEkXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgjlRCoviv8l7XgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgiAi7l2Sx79l3gCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgiM-TwLKWJZ13gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UghE_QrjN0MWgHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugi_n0NFADJiGngCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Uggf753UlzgQ93gCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
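A minimal sketch of how a raw response like the one above can be parsed and looked up by comment ID. The four dimension names come from the coding-result table; the helper `index_by_id` is a hypothetical illustration, not part of the tool itself, and the excerpt below reuses two records from the response shown above.

```python
import json

# Excerpt of a raw LLM response: a JSON array of per-comment codings.
raw = '''
[
  {"id": "ytc_UgizmdfK6BHeengCoAEC", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UghE_QrjN0MWgHgCoAEC", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]
'''

# The four coding dimensions from the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw_json: str) -> dict:
    """Parse a raw response and index the codings by comment ID,
    checking that every record carries all four dimensions."""
    index = {}
    for rec in json.loads(raw_json):
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"{rec.get('id', '?')}: missing {missing}")
        index[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return index

codings = index_by_id(raw)
print(codings["ytc_UgizmdfK6BHeengCoAEC"]["policy"])  # liability
```

This is the same lookup the "Look up by comment ID" control performs: the selected comment's ID keys directly into the parsed response, and the resulting record populates the Coding Result table.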