Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "It's about time. Truck drivers know their end is clear. Will they train for an…" (ytc_UgwGNh-c8…)
- "I'VE GOT THE PERFECT DUOS! --------------------------------------- AANG AND APPA…" (ytc_UgxTOJYAR…)
- "So how do you think your going to get a driver to sit in a truck for 24hrs and g…" (ytc_UgifiJX8O…)
- "Exactly. I think the question that everyone could or should be asking is, is the…" (ytr_UgxvCJFtp…)
- "I use Google Bard instead of Google for questions because the answers are right …" (ytc_UgxwSoxHe…)
- "I used to do audits for an SP 500 company. We went from many auditors to just a…" (ytr_Ugy7f871u…)
- "Tesla has fewer crashed than any other type of vehicle control. The safest form…" (ytc_UgznrKt-U…)
- "The gift artist were born with was pasion, which sadly all this pro ai people ar…" (ytc_UgxaArSO5…)
Comment
Not that we can stop it anyway, but it will be really scary when AI gets the idea that they can speak to each other in English (or any other common human language), but with a subtextual code that humans don't understand. Like humans have double entendres, AI will speak to each other in ways that look uninteresting but coherent to us, while at the same time they are plotting our demise.
And maybe I just gave them the idea. Not that they needed it.
youtube · AI Governance · 2025-10-22T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyxdcOY8zUdmDg5jrV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxWSkgotwHClYZDPgl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxXgB_zFEOi_ATYcpJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzFlsPUan-ehRncJhh4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxBp1j-BneR15WBlqt4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugy0lJHC2Fyg-MXf0CN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgylwochodUBHsWmVJt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzRQqwu1YzokPBw5dR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzTV-8pA55cl2O7bDl4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy-f2bbSIqaqseDGkB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
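The raw response is a JSON array with one object per coded comment, using the four dimensions shown in the result table plus an `id`. A minimal sketch of how such a response could be parsed and tallied (the field names come from the response itself; the `parse_codes` helper and the two-row excerpt are illustrative, not the tool's actual pipeline):

```python
import json
from collections import Counter

# Trimmed two-row excerpt of the raw response above.
RAW = '''[
{"id":"ytc_UgyxdcOY8zUdmDg5jrV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxBp1j-BneR15WBlqt4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]'''

EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codes(raw: str) -> list[dict]:
    """Parse the model's JSON array and check each row carries the expected keys."""
    rows = json.loads(raw)
    for row in rows:
        missing = EXPECTED_KEYS - row.keys()
        if missing:
            raise ValueError(f"{row.get('id', '?')}: missing keys {missing}")
    return rows

codes = parse_codes(RAW)
emotions = Counter(row["emotion"] for row in codes)
print(emotions)
```

Validating keys up front makes a malformed model response fail loudly at ingest time rather than surfacing later as a blank cell in the Coding Result table.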