Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (select an entry to inspect)
- "There are more ethical uses than using the normal consumer versions available. R…" (ytc_Ugy1GCUvO…)
- "Not actually true, the tech was a natural course of evolution. The issue is how …" (ytr_UgzfhQLxW…)
- "Sadly he isn't the only one who passed from AI, Adam was abt 16 i think and he w…" (ytc_UgzzbF8jT…)
- "22:17 He’s saying we can have a utopia but we need a political revolution first …" (ytc_Ugy8vckkV…)
- "oh this case has already been litigated by the monkey who took a selfie. only hu…" (ytc_Ugzs4L0tb…)
- "All AI has to do is change some numbers on the stock market and let us take each…" (ytc_Ugx095U6p…)
- "Sorry to bust the bubble but no AI is not better than human. CEO are just lying …" (ytc_Ugx8PyfD8…)
- "1:30 in, devs should understand this exactly: 1. Deep expertise ie Sr devs - ev…" (ytc_UgwLXwo5y…)
Comment
Contrary to popular belief, AI will NOT take over our jobs. Not at least in the next hundred years. Although, when you think about it, that is just three generations away. AI will not be that reliable as long as it's not logic based. Apple already pointed this out (also maybe as an excuse to their own AI woes). Currently, AI are just script-based systems. Not nearly as advanced as most people think when it comes to real world technology applications. Even after a hundred years, AI will most likely take over clerical jobs, & recently automated processes like operating a vehicle. As for when will take over manual labor - same estimate as when our great great grandparents thought "flying cars" will be the mode of mass transport in the year 2000s.
youtube · AI Responsibility · 2025-11-24T05:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxBaFgxNGd9xVDx6jl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwjYiNcKwF3YL_npIV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugzz45IVFqYabVsOfet4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzIyDnOmzgPCiKgqod4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyrHe5s1BS12R97yqZ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwTm35G54OzZeyua3Z4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyBIeo4W6Nc20GZdhd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyPhfLmHSJhgsl2mwp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwJGtBQ5VpgiRKhJSB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyEBi2Bs0YivanUP-h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
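The raw model output above is a JSON array with one record per comment, each carrying the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response might be parsed and indexed for lookup by comment ID follows; the `index_codes` helper and the `DIMENSIONS` value sets are illustrative assumptions inferred from the records shown above, not the tool's actual implementation or a complete codebook.

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw_response = '''
[
 {"id":"ytc_UgzIyDnOmzgPCiKgqod4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgwJGtBQ5VpgiRKhJSB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
'''

# Value sets inferred from the responses shown above; likely not exhaustive.
DIMENSIONS = {
    "responsibility": {"company", "ai_itself", "distributed", "none", "unclear", "user"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "unclear", "industry_self", "liability"},
    "emotion": {"resignation", "outrage", "mixed", "indifference", "approval"},
}

def index_codes(raw: str) -> dict:
    """Parse the model output and index records by comment ID,
    skipping any record with a value outside the known sets."""
    out = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in allowed for dim, allowed in DIMENSIONS.items()):
            out[rec["id"]] = rec
    return out

codes = index_codes(raw_response)
print(codes["ytc_UgzIyDnOmzgPCiKgqod4AaABAg"]["emotion"])  # indifference
```

A lookup like this is what the "Look up by comment ID" field presumably does: map an ID such as `ytc_UgzIyDnOmzgPCiKgqod4AaABAg` back to its coded dimensions.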