Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Thanks for the comment! I wish I read this before I made the video, would've inc…" (ytr_UgwnbzZTw…)
- "Wait until a bunch of horror accidents will happen…and it will happen without a …" (ytc_Ugz3LwGbB…)
- "I have yet to encounter an AI that is smarter and more accurate than me. I ask i…" (ytc_UgxBRQ_La…)
- "@Dantick09 humans learn much slower than robots. Not saying AI aren't better but…" (ytr_UgwSPFf1b…)
- "Copyright doesn't matter how you create something. It has nothing to do with tha…" (ytc_UgwM21iF8…)
- "Idk why is everyone is hating on him and the companies that are benefiting from …" (ytc_UgzNjCVFQ…)
- "AI is so not good! It literal had to steal, plagerize to exist. It's built on a …" (ytc_UgxWB_1hu…)
- "GUYS drawing is hobby, dont look at AI art, you are drawing not to be the best b…" (ytc_Ugzlj9QHY…)
Comment
I can totally see how A.I. is the last technology we ever create. Hereafter it's only logical to think that A.I. will create every other technology that will follow, if any.
And once a computer program can "improve" and modify itself, NO-ONE can predict the outcome. So yea, it's probably wise to prepare for the worst. Don't think your millions in your bank account will help you when D-Day comes. Rather use it wisely NOW, before the millions become worthless.
youtube · AI Governance · 2025-06-23T12:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugw5OnOq1apyhxfSInd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx6GMwGphlc4zcAETB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgyplHmgyL3envBc9i54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy3QhWnjyuD8NIAG1t4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwWfjVYCJpa52r7Cb94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyR74bbngVvomG_UQh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugwo5CEQJ8pbMaoUsq14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugwi7uMkQj4_bJB2BeZ4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxtX2prwNhbxONBT9J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzbgJShS7Z7OsqVSm14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
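The raw response above is a plain JSON array with one record per comment, keyed by comment ID. A minimal sketch of inspecting it programmatically (the two records below are copied from the array above; the parsing approach is an assumption, not the tool's actual implementation):

```python
import json

# Raw model output: a JSON array of coding records, one per comment.
# These two records are taken verbatim from the response shown above.
raw_response = """
[
  {"id": "ytc_Ugw5OnOq1apyhxfSInd4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzbgJShS7Z7OsqVSm14AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
"""

# Index records by comment ID so any coded comment can be looked up directly.
codes_by_id = {rec["id"]: rec for rec in json.loads(raw_response)}

# Look up one coded comment and read off its dimensions.
rec = codes_by_id["ytc_Ugw5OnOq1apyhxfSInd4AaABAg"]
print(rec["responsibility"], rec["emotion"])  # ai_itself fear
```

Because each record carries its own `id`, batched responses like this can be joined back to the source comments even when the model returns them in a different order.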