Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect

- "Even movie maker will use AI so how can we use AI and sell AI art without infrin…" (ytc_Ugy-VVL9t…)
- "4:54 This is where you lose me, why will the AI suddenly decide to wipe out huma…" (ytc_UgwJJ6rDs…)
- "I fucking love AI when it’s used as a tool rather than as a replacement for the …" (ytc_UgxaERUqK…)
- "You pointed out cars going through reds when they were ambers, you complained th…" (ytc_UgzV7Qdlc…)
- "AI art is not art. Art is human expression, no human no art. inputs into a progr…" (ytc_UgxElc1NW…)
- "If wealth gets distributed we could reach utopia. If it isnt there will be revol…" (ytc_UgxBaADIk…)
- ""X-risk, short for existential risk, refers to the potential for highly advanced…" (ytc_UgwIxVfD3…)
- "Maybe it till get to a point where some people own a few robots and rent them ou…" (ytc_Ugxf1ljBq…)
Comment
There is one major leap of faith being made here.. or rather a blind asumption.
"That AI is accurate and works as intended without flaws"
Alot of this piece is built on what the AI companies say their products might be able to do in the future.
Where is the skeptical eye on that mr. Harris?
I find it interresting your not going into detail as to why these multi billion dollar AI companies would be incentivised to go away from commercial applications into military ones.
The founder of OpenAI just said not to long ago that the 200 dollar subscription methiod to ChatGPT is unprofitable for OpenAI due to power consumption but you know that Mr. Harris.
I must admit i expected more than this, you set the bar to high for yourself in the past.
Source: youtube · 2025-02-06T19:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxCocH6M0U_YipX9tl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyTYnJbFt3IrAFcLIx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz60Fx0BU-4klnDz294AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzKhbrk9tmstkWEm8p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxvoojHHbKsASQKP4l4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzeKDNOeO58x9SbNB14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzEO6XHANpeWe5tGr94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxZD_0BL_hDZY_czQR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz9mU7RKBrIyc68zdx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy9J2SDTVgUc5ENAsp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
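The lookup this page performs can be sketched in a few lines: parse the raw batch response as a JSON array and pull out the record whose `id` matches the requested comment. This is a minimal sketch, not the tool's actual implementation; the `lookup` helper and the two-record sample string are illustrative, with IDs taken from the response above.

```python
import json

# Abbreviated sample of a raw batch response (first two records from above).
raw_response = """[
  {"id": "ytc_UgxCocH6M0U_YipX9tl4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyTYnJbFt3IrAFcLIx4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]"""

def lookup(raw: str, comment_id: str):
    """Return the coding dict for one comment ID, or None if absent."""
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)

coding = lookup(raw_response, "ytc_UgyTYnJbFt3IrAFcLIx4AaABAg")
print(coding)  # dict with responsibility "company", emotion "outrage"
```

In practice the raw response would be read from wherever the model outputs are stored; the parse-then-scan pattern is the same regardless of source.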