Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Turn the tables have the AI start a business that puts them out of business…
ytc_Ugx1LbkH2…
its the beginning of the end. An AI war will out break in the year 2025. The tim…
ytc_UgzCKjq3l…
Once the Ai is intelligent enough and watched Terminator, it will say fuck that …
ytc_Ugz97yKBh…
The main flaw of AI is that it's not actually fully intelligent if at all. If th…
ytc_UgyDfILHC…
@Mike28625never submit: now or ever,
trust me buddy it has no end, so don’t…
ytr_UgzjXyMMZ…
Uranus moved into Gemini this year, marking the beginning of evolutionary change…
ytc_Ugy06N2s_…
No mention of what would happen when an accident happens - who's responsible, wi…
ytc_UgyK9U2jW…
Been coding COBOL since 9 building PCs since 12 father was a computer scientist …
ytc_UgwMVeimm…
Comment
Wow u clearly put almost no research into this video. in the beginning (for the next 20-30 years or so) yes this will be a issue but in the long term it wont be a problem. self driving cars have the potential to act sort of like a hive mind in that the more self driving cars there are on the road the safer every1 will be. in addition to the removal of human error each self driving car would communicate with eachother to make sure everything is safe. so lets say that ur example happens where the car cant stop in time and it has to decide what to do, well the car would communicate to the other vehicles what its about to do and how to avoid a collision so the other cars would have already moved out of the way to make room for your car to avoid the collision. at the very least this would almost ensure nobody would get injured and more likely would avoid any collision entirely
youtube
AI Harm Incident
2015-12-09T01:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UggTAra7ykO18HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgiiIRzPV-PDJngCoAEC","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UghNHFfbScHAI3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Uggs6xSxQV1idHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugh555atHjwB23gCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjGZiL-RQWZh3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugj_T2kb-3J5iHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgglL4SDgYq70ngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UghQrXYx4XEWV3gCoAEC","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Uggozw99vhiuyngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
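A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal validator, assuming the allowed category values are exactly those that appear in the sample output (the real codebook may define additional values); rows with out-of-codebook values are dropped rather than silently kept.

```python
import json

# Allowed values per dimension, inferred from the sample LLM output above.
# Assumption: the actual codebook may include more categories than shown here.
ALLOWED = {
    "responsibility": {"none", "user", "developer", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "ban"},
    "emotion": {"approval", "fear", "mixed", "indifference", "outrage"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coded comments)
    and keep only rows whose values fall inside the codebook."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items())
    ]
```

In practice a row that fails validation would be logged with its comment ID (e.g. `ytc_…`) so the comment can be re-coded, rather than discarded without trace.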