Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I was studying AI over 15 years ago in college, AI is not new but the developmen…" (`ytc_UgwgeFiZW…`)
- "if it were realistic, the robot would have cried sexual assault and patriarch…" (`ytc_UgwTRKCUU…`)
- "Theft is defined as taking property and depriving one party of its use. Copying …" (`ytc_UgxxrVIJu…`)
- "What ChatGPT supplied in response to the lawyers' request was different. Instead…" (`ytr_UgyHXxLCw…`)
- "AI needs to be regulated into oblivion before Superintelligence takes over and w…" (`ytc_Ugym0C1bp…`)
- "AI doesn't make art more accessible, it makes art harder to access as it floods …" (`ytc_Ugy2ETtPt…`)
- "And there is no incentive to correct these models. This is not a new problem, …" (`rdc_luwfjlg`)
- "Well, whaddaya know? AI inspired art? TBF the person advertised it as AI art if…" (`ytc_UgyhfqKND…`)
Comment
nope. this video is no different from Vsauce's video on Supertasks with the same publish date. These videos are not really well prepared for their purpose. Self-driving cars don't have to make that elaborate a decision, as they will have little information and time available; the outcome will just be the result of a "reaction", similar to any animal's.
If we were to have a robot or Artificial Intelligence, it would always have a finite amount of information and time to choose, and it would make less optimal decisions than is possible. The topic only becomes relevant if someone or something can sue another over cases such as the above.
youtube
AI Harm Incident
2015-12-11T16:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugjwkh7gbtadm3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UghlKJ8Nc_INgHgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UghjkWiCvWeo1ngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghilDXtRwfSCngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgjN2KgJTlwlC3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugh54ZJdEXZfwngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjWA5kpI1F_UHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgjB5N2AWV6PlHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugh2zj0x13RnS3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgiaFVeDpzC9U3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
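The raw response is a JSON array of per-comment codes, one object per comment with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` keys. A minimal sketch of how such a batch could be parsed and indexed by comment ID is below; the `parse_codes` helper and the single-entry payload are illustrative, not part of the tool itself.

```python
import json

# Expected keys for one coded comment, per the schema shown above.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codes(payload: str) -> dict:
    """Parse a batch LLM response and index the codes by comment ID.

    Entries missing any expected key are skipped rather than raising,
    since model output is not guaranteed to follow the schema.
    """
    codes = {}
    for entry in json.loads(payload):
        if EXPECTED_KEYS <= entry.keys():
            codes[entry["id"]] = {k: entry[k] for k in EXPECTED_KEYS - {"id"}}
    return codes

# Hypothetical one-entry payload, copied from the batch above.
raw = '''[
  {"id":"ytc_UghjkWiCvWeo1ngCoAEC","responsibility":"ai_itself",
   "reasoning":"deontological","policy":"none","emotion":"indifference"}
]'''

codes = parse_codes(raw)
print(codes["ytc_UghjkWiCvWeo1ngCoAEC"]["responsibility"])  # → ai_itself
```

Indexing by ID is what makes the "Look up by comment ID" view cheap: each coded dimension for a comment is a single dictionary lookup.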