Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I wish these dudes would quit saying "we" like they speak for all of us. I know…" (ytc_UgwcyAHmy…)
- "Yeah, mass-produced, A.I.-steered drones is all we needed in this world. This is…" (ytc_UgwFcfkUW…)
- "Holding back the release of something by 10 days, because of uncertainty of how …" (rdc_o9vw4cr)
- "eventually, ai will destroy itself / if ai images keep getting created, then those…" (ytc_Ugxyrek5-…)
- "@LordofdeLoquendo I know a lot of people from Fiverr use A.I to sell artwork tha…" (ytr_Ugz3nvbYF…)
- "Maybe no real, actual, true logical intelligence, digital or analogue, would con…" (ytc_Ugx9FWZey…)
- "The idea that someone would brag about being immune to Nightshade, and if they a…" (ytc_Ugzr15tSn…)
- "Interestingly it seems most of Carmack's current AI work is in the area of reinf…" (ytr_UgygvsaO_…)
Comment
It's not true because AI can't do physical jobs and for that you would need sophisticated robots. And that's much further off than smart AI. I don't think healthcare will ever be sourced out to AI and/or sophisticated robots. What he's talking about is at least 200 years away.
youtube · AI Governance · 2024-12-06T01:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgzviWcWDT9w1pAh9iN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxUgpEm4DOIO7renDR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwKLHbo3Al_KIIFejd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzEqulA5JdSd3MlAJF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwVaftWSecpWod8MNF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugwy4JirtmX8oCacpi94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwomdOYCEwjAYICwqF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwZjmiNEa2A9yXlaLd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwxfdX661-R0INkL-F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyGu9Rz0j07ilxwBzF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
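A raw response like the one above can be validated before its coded values are written back to the database. Below is a minimal sketch in Python, assuming the four dimensions shown in the Coding Result table and only the value sets actually observed in this output (the full codebook may define additional categories; the function name and example ID are illustrative, not from the tool):

```python
import json

# Allowed values per dimension, as observed in the raw response above.
# Assumption: the real codebook may permit more categories than these.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself", "distributed"},
    "reasoning": {"unclear", "deontological", "consequentialist", "virtue"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"indifference", "outrage", "fear", "resignation"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate every record."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError("record is missing a comment id")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} value {value!r}")
    return records

# Hypothetical one-record response, mirroring the format above.
raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"}]'
coded = parse_coding_response(raw)
print(coded[0]["emotion"])  # -> fear
```

Rejecting out-of-vocabulary values at parse time catches the common failure mode where the model invents a category that the coding scheme does not define.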