Raw LLM Responses
Inspect the exact model output behind any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I'm good with living among a city of AI robots over a city of human beings any d…" (ytc_UgyNfYpUL…)
- "Between worrying about what Donald Trump is going to do and AI. I give up worryi…" (ytc_UgzA-WGjf…)
- "Thing is these people discussing it are gonna die before they will see the impac…" (ytc_UgyCdZOCM…)
- "It sounds like you have a fun idea for Sophia's style! Dressing her up in differ…" (ytr_UgxHj6aTu…)
- "My chatgpt the moment I sounded down because I had mentioned I was depressed. I…" (ytc_UgyPlCHka…)
- "good story but nahh let it happend if you think noone sits behind ai and says wh…" (ytc_Ugw_FFt9E…)
- "@silvianbruno7512 It is quite funny. 4 years ago, people would not even have bel…" (ytr_UgwNm4-m3…)
- "I remember that I actually quit art a while during covid, partially because I wa…" (ytc_Ugy9scYNQ…)
Comment
This is a very ideal, best case scenario (for AI proponents). The recent MIT study that showed 95% of AI projects fail to deliver. Also this is great if your company/country is heavily digitised and ready. Most are not, even the US. There is this perception that everything is seemlessly digital, companies running off edge computing or in data centres with all their data, processes and workflow automated/optimised. That’s simply nowhere near the reality I see in my corporate life. The AI might be ready, but the foundations it needs isn’t.
youtube · AI Governance · 2025-09-04T08:1… · ♥ 62
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
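Each coded comment carries four closed-vocabulary dimensions. A minimal validation sketch in Python, assuming the label sets observed in this batch are the complete coding scheme (the real scheme may include further labels):

```python
# Allowed labels per dimension, as observed in this batch's codings.
# ASSUMPTION: these sets are treated as the full vocabulary for illustration.
ALLOWED = {
    "responsibility": {"none", "company", "user", "ai_itself", "government", "developer"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "mixed", "approval"},
}

def validate_coding(coding):
    """Return a list of error strings; an empty list means the coding is valid."""
    errors = []
    for dim, allowed in ALLOWED.items():
        value = coding.get(dim)
        if value not in allowed:
            errors.append(f"{dim}: unexpected value {value!r}")
    return errors

# The coding shown in the table above passes:
example = {"responsibility": "none", "reasoning": "consequentialist",
           "policy": "regulate", "emotion": "fear"}
print(validate_coding(example))  # -> []
```

A check like this catches model outputs that drift outside the codebook before they reach analysis.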
Raw LLM Response
[
{"id":"ytc_UgyGhBBVR9Xb_DlOeLp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwpyVoEMqGt4p_DS3N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz6Ibr3gdEp5VLaPDV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy5hE0p0fcflUxvk2B4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgydmNxJF69tJJq_Nxl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw9kPiDfPjAn1yADad4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyyNxvhunuQqRWkcqZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw7Eda0GrwAi641dXp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzjnMOmJW4K6akSpMt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgybwcRhrdTaCPbK09t4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}
]
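The raw response is a JSON array of per-comment codings, one object per comment ID. A minimal Python sketch of the lookup-by-ID step, assuming this array format (the two IDs are copied from the response above):

```python
import json

# Raw LLM response: a JSON array of per-comment codings, in the
# format shown above (excerpted to two entries from this batch).
raw_response = """
[
  {"id": "ytc_Ugw9kPiDfPjAn1yADad4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugw7Eda0GrwAi641dXp4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "ban", "emotion": "fear"}
]
"""

def lookup_coding(raw, comment_id):
    """Parse the raw model output and return the coding dict for one
    comment ID, or None if that ID was not coded in this response."""
    codings = json.loads(raw)
    return next((c for c in codings if c["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_Ugw9kPiDfPjAn1yADad4AaABAg")
print(coding["policy"])   # -> regulate
print(coding["emotion"])  # -> fear
```

Returning None for a missing ID (rather than raising) makes it easy to flag comments the model skipped in a batch.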