Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- They want the global market to breakdown, they want the ai bubble to pop, so the… (ytc_UgzXzIKkr…)
- I think very soon people will get bored and tired of reading AI-generated conten… (ytc_UgzvxMTMc…)
- So now, the 81% of REAL welfare videos showing blacks complaining about welfare,… (ytc_UgxlcBjOM…)
- 15:20 ai artist rely so much in robotic stuff they forget that if someone is pas… (ytc_UgwAGfXkw…)
- Wowwwwww, this is crazy. My husband was using Gemini Pro 2.5 and it convinced hi… (rdc_muow3vv)
- "Just two more weeks and the bubble will pop, bro!" Have you ever considered tha… (ytc_UgwcDuNR3…)
- @Hungusdungusthethirdfair but, Japanese movies are great and I can’t understand … (ytr_UgyXXgRRv…)
- I'm 49. When I as 16 it was computers that would end up replacing people. Later … (ytc_Ugzdze_f_…)
Comment
I applaud this man patience with you. Idk if you were just playing devil's advocate for the sake of conversation or if you really just don't get it at all but damn was this a rough listen at times 😅 your comparisons to fear mongering cars and silly stuff like that is irrelevant to a self teaching autonomous agent. Saying people wouldn't rush to trust AI is a lie ppl ran to let a tesla drive it around. The general public doesnt actually push back on AI at all. Humans have misused/misstepped with EVERY piece technology. We will here too. The only question is how much damage will we cause? You don't seem to understand that the problem is the exponential growth That comes when ai is released to the public.
Edit: just got to the devils advocate part thank god 😅
youtube · 2024-06-29T04:4… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
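The dimension table above can be produced directly from one coded record. A minimal sketch of such a renderer; the function name and record shape are illustrative assumptions (the field names follow the raw LLM response format shown on this page), not the tool's actual code:

```python
# Hypothetical renderer: turn one coded record into the markdown
# dimension table displayed above. The "Coded at" row is added by the
# tool's own clock at coding time, so it is not part of the record here.

def render_coding_result(record: dict) -> str:
    rows = [
        ("Responsibility", record["responsibility"]),
        ("Reasoning", record["reasoning"]),
        ("Policy", record["policy"]),
        ("Emotion", record["emotion"]),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {name} | {value} |" for name, value in rows]
    return "\n".join(lines)

example = {
    "id": "ytc_UgyVTzGqDVa6Gocdp_N4AaABAg",
    "responsibility": "user",
    "reasoning": "consequentialist",
    "policy": "unclear",
    "emotion": "outrage",
}
print(render_coding_result(example))
```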
Raw LLM Response
```json
[
  {"id":"ytc_UgzVMpoQTwl77oyyzK94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyVTzGqDVa6Gocdp_N4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwTmXflsrZvOqsydQd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxqBv7kY4-LnkdKFu94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz2XUIFHC_UVxXR1GZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz8ctBEM7ir0D9WzlV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxXtSwI8t76z5xC7jJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugz7u0pBS5mp3_3BOZJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugx8HzK8h1vc-HEe2Ul4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzAJQUv7UPQjENtRep4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
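A raw batch response like the one above can be parsed and spot-checked before the codes are stored. A minimal sketch, assuming the response is a JSON array of per-comment records; the allowed vocabularies below are inferred only from the values visible in this batch, so the real codebook may include categories not listed here:

```python
import json

# Category vocabularies inferred from this batch (assumption, may be incomplete).
RESPONSIBILITY = {"user", "developer", "ai_itself", "distributed", "none"}
REASONING = {"consequentialist", "deontological"}
EMOTION = {"fear", "outrage", "approval", "mixed"}
REQUIRED = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index the records by comment ID."""
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        missing = REQUIRED - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing fields {missing}")
        if rec["responsibility"] not in RESPONSIBILITY:
            raise ValueError(f"unknown responsibility: {rec['responsibility']!r}")
        if rec["reasoning"] not in REASONING:
            raise ValueError(f"unknown reasoning: {rec['reasoning']!r}")
        if rec["emotion"] not in EMOTION:
            raise ValueError(f"unknown emotion: {rec['emotion']!r}")
        by_id[rec["id"]] = rec
    return by_id

raw = '''[
  {"id": "ytc_UgzVMpoQTwl77oyyzK94AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]'''
batch = parse_batch(raw)
print(batch["ytc_UgzVMpoQTwl77oyyzK94AaABAg"]["emotion"])  # fear
```

Indexing by ID is also what makes the "Look up by comment ID" view cheap: each lookup is a single dictionary access rather than a scan of the batch.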