Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Im pretty sure all he says will be reality in a near future, however AI might have the knowledge to do all he is saying by the year he predicts, but som much equipment need to change for example transports, to change all the trucks in the world to self driving intelligent vehicles is going to take a massive investment on the company side. not so sure this will happen as fast as predicted the amount of money required is overwhelming.
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2025-12-10T08:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzByY8yCi9ddD7P5p14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy67SWPHGkooo3JbPN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwHzPiMWXIPQfmHhiV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyNu77TW2xgqfe8Ro94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwQ43Uy04efFF0dGaV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxodt6AVDBvFzZzkbB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwuyc1rfaVq6PmgRDl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgxNgg4PtOWfaQOAwWt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwhAKtXRfer-44MYBt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxQ5mUqCWgQLw1TQp94AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"}
]
```
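A response like the one above can be checked programmatically before the codings are stored. The sketch below is a minimal validator, assuming the allowed values per dimension are exactly those visible in the responses on this page (the real codebook may define more categories, and the function name `validate_coding` is illustrative, not part of the tool):

```python
import json

# Allowed values per coding dimension — inferred from the responses shown
# above; the actual codebook may include additional categories (assumption).
SCHEMA = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "ban", "liability", "industry_self", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "mixed"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Each row must be an object with a comment ID and one
        # recognised value for every dimension in the schema.
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_Ugwuyc1rfaVq6PmgRDl4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"industry_self",'
       '"emotion":"resignation"}]')
print(len(validate_coding(raw)))  # → 1
```

Rows that fail validation (malformed JSON objects, unknown category labels) are silently dropped here; a production pipeline would more likely log them for re-coding.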