Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
This robot is the next step for the Illuminati do not be stupid people nwo is so…
ytc_UghlBJvFm…
i think a huge part why AI is so prevolent is the perception that you have to be…
ytc_UgwKuUCr6…
This guy should’ve asked ChatGPT for that answer cuz his was total word salad 🥗…
ytc_UgyPUN35-…
We should not invent robot because in future it really destroys the world please…
ytc_UgxEr_S91…
"not facial recognition or people"
Riiiiight sure
On a side note, anyone know ho…
ytc_UgyyFr6hA…
So I am an artist, and i tried Ai to change and merge my artworks from curiosity…
ytc_UgxxqI88_…
I'm not overly worried yet, Just now I was using ChatGPT 4 to get some input on …
ytc_UgymMMChT…
Just the simple FACT Lemoine was fired for voicing this Ai REALITY .. shows us G…
ytc_UgwcjcjCG…
Comment
AI race would end with a big distaster just like the nuclear race did. Some country or some terror group will turn half of world's wealth to zero. Our money wont be safe online anymore. Houses wont be safe either. Such a big fraud could cause another nuclear war too.
Alternatively robot armies might destroy a city automatically. Some big disaster will definitely happen before we wake up.
Or we can learn our lessons from the nuclear round. Proactively prepare for disaster avoidance without slowing down the AI race.
Again its the Human stupidity that only listens to explosions be it online or offline.
youtube
AI Responsibility
2025-05-22T01:4…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwG6EVp0ebYHSYeEL14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzITkFaWgclkXhuiml4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyaffFFNgaInKKH4wF4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxUECCHVsaRbVF6XiB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxG18gOOlravQe2SWZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxrupSM3gL46TWvxxZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzUjlA6D0vt-8YMD694AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyL-OvW5hOZY-4itrp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyDj_c0W-_4mAiHN7V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugxkjfmbq_aYfYUXmEN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
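The batch response above can be parsed and indexed to support the "look up by comment ID" workflow. The sketch below is illustrative only, not the pipeline's actual code: the helper name, the required-key check, and the two sample records (copied from the response above) are assumptions about how such a lookup might be built.

```python
import json

# Two records copied from the raw LLM response above, used as sample input.
raw_response = """
[
 {"id":"ytc_UgxrupSM3gL46TWvxxZ4AaABAg","responsibility":"distributed",
  "reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgwG6EVp0ebYHSYeEL14AaABAg","responsibility":"none",
  "reasoning":"unclear","policy":"none","emotion":"indifference"}
]
"""

# The four coding dimensions shown in the Coding Result table, plus the ID.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(payload: str) -> dict:
    """Parse the LLM's JSON array and return {comment_id: coding_dict},
    skipping any record that is missing a required key."""
    records = json.loads(payload)
    return {
        r["id"]: {k: r[k] for k in REQUIRED_KEYS if k != "id"}
        for r in records
        if REQUIRED_KEYS <= r.keys()  # keep only fully-coded records
    }

codings = index_codings(raw_response)
print(codings["ytc_UgxrupSM3gL46TWvxxZ4AaABAg"]["emotion"])  # fear
```

Skipping malformed records rather than raising keeps one bad LLM output from discarding the whole batch; a production coder might instead log and re-queue them.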