Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "The facts on the ground changed." No, Dean, you were told this was coming. Not … (ytc_Ugxf2ysfr…)
- A question, what exactly does "Character" AI Chat mean? Do you tell the AI to ma… (ytc_UgwjqykeL…)
- I'll elaborate in a reply to the comment if anyone cares to read it from the per… (ytc_UgyEv3xqf…)
- Why don't you think for yourself then and think of the utopia ai can create you … (ytr_Ugy4oCGrH…)
- this is a stupid comment. AI art is trained off of existing art which inherently… (ytr_Ugx_t-mN7…)
- So were in a age were cars are killing people,this situation should have been ad… (ytc_UgzeR2MAo…)
- 1. How dumber model(previous) can produce better, smarter models with this enfor… (ytc_Ugx6mKiv-…)
- Exactly. We could just invest in public transport infrastructure and trains but … (ytr_Ugy1yTMVg…)
Comment
> Banning the use of the word Autopilot is a good point. I think we all got too caught up in the hype of self-driving cars.
>
> I drive a Hyundai and it's merely called steering assist. It'll give up and flash an alarm when it loses confidence in its reading of the road, and it doesn't take much: bumps in the road, turns, fading lines, hill crests. But as an assist to take the strain off of having to hold the steering straight on long, flat sections when your car wants to keep steering right because of the road camber, it's brilliant. To me that's just the right amount of automation, because it forces the driver to stay involved in the act of driving.
youtube · AI Harm Incident · 2022-09-04T03:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgyO1_FlZgc7zGEtvQZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyfIOXETd9sQUahzfh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy_TvcXG6hYfTSwO7p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxkvCeiCNT01fCODR14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwnarXl0MyRYDBsFg14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwJAVBinUtvxjiTIWR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugys9JRs1nf8MiPHxJR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgydBCva0h09ainfQ9J4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxY3cnI2ChFduxI4t14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwWHhWFRjz8FklDBGJ4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
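The raw response above is a JSON array, one object per comment, with an `id` plus the four coding dimensions from the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and indexed by comment ID — this is an illustrative helper, not the tool's actual code, and the field names are taken from the sample above:

```python
import json

# The four coding dimensions seen in the sample response above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_coding_response(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM coding response into {comment_id: {dimension: value}}.

    Rows missing the id or any dimension are skipped rather than raised on,
    since LLM output can be partially malformed.
    """
    coded = {}
    for row in json.loads(raw):
        if not all(key in row for key in ("id",) + DIMENSIONS):
            continue  # skip malformed rows
        coded[row["id"]] = {dim: row[dim] for dim in DIMENSIONS}
    return coded

# Usage with a hypothetical single-row response:
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate",'
       '"emotion":"approval"}]')
coded = parse_coding_response(raw)
print(coded["ytc_example"]["policy"])  # regulate
```

Indexing by comment ID makes the "look up by comment ID" view above a single dictionary access, and silently skipping incomplete rows keeps one bad object from discarding a whole batch.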