Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "The AI models these fools use are fed CP as training data for image and video ou…" (ytr_Ugzw2eRZy…)
- "Bernie I must be missing something , if AI takes jobs , then nobody has an incom…" (ytc_UgwEJ38lM…)
- "Analyzing and understanding new to you information is a great use case of AI. An…" (ytc_Ugy0bkZcj…)
- "To fellow AI researchers and students. I am sure you are reacting the very same …" (ytc_Ugz5at2qW…)
- "Both are not AI because I saw my drunk ass friend did the same thing as Clip #2…" (ytc_UgyvPhrGU…)
- "I called AI a fancy search engine and it replied “I’m not a fancy search engine …" (ytc_UgxGmhRGc…)
- "Correction: For this to work, you just have to be human, on earth, at the beginn…" (ytc_Ugw8HvhhO…)
- "* Humans are **smart** but slow (10 000 hours and all that) * AI is **fast** but…" (rdc_mzydnd0)
Comment

> Why don't these cars communicate with each other via an RF link or the Waymo company intranet they have anyway built-in? Let's hope they don't go back a 100 years to invent they could use their honks to Morse-code json objects to each other.

Source: youtube
Posted: 2024-08-27T13:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugyv6HjWr7ssLNwRsEl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgzjLgzSnzuB_eyXt0B4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgwDdr_T97aks58WM0R4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyVgc5BOD5aumm1Xqp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyYc_wGdXafx4stDNB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugzum8HU4cJlf4I-Wy54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw6pX23kaQsUGQWafx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxD0tYeKSy_rlX3bxV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwQiipNZJvrMgTTyhh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugwd8jftGaa3tkYfmFx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
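A raw response like the one above can be turned back into per-comment coding records with a small parser. The sketch below is a minimal illustration, not part of the tool: `parse_raw_response` and the `DIMENSIONS` tuple are hypothetical names, and the field names and fallback value (`"unclear"`) are assumed from the sample records shown here; the full code book may allow other values.

```python
import json

# Coding dimensions as they appear in the sample raw response above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_raw_response(raw: str) -> dict:
    """Map each comment ID to its coded dimensions.

    Entries without an "id" are skipped so one malformed record does not
    fail the whole batch; missing dimensions fall back to "unclear".
    """
    records = {}
    for entry in json.loads(raw):
        comment_id = entry.get("id")
        if not comment_id:
            continue
        records[comment_id] = {dim: entry.get(dim, "unclear") for dim in DIMENSIONS}
    return records

# Usage with a single (shortened, hypothetical) record:
raw = ('[{"id":"ytc_abc","responsibility":"company","reasoning":"deontological",'
       '"policy":"industry_self","emotion":"indifference"}]')
coded = parse_raw_response(raw)
print(coded["ytc_abc"]["policy"])  # prints: industry_self
```

Keying the result by comment ID matches the page's "look up by comment ID" workflow: a single dictionary access retrieves the coding for any inspected comment.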