Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I once spent 5 hours making a process run in 12 minutes instead of 8 hours. Opti…" (ytc_UgxYyALdq…)
- "The danger is the escalation of this, and how obsolete any other military force …" (rdc_feyymje)
- "We need a world where people live healthier, happier and more fulfilling lives.…" (ytc_Ugwp_WT66…)
- "TheMMOptimist You didn’t watch to the end of the video where they show examples …" (ytr_UgxtvrRb_…)
- "AI is not good for the human race. It is death. What happens to people when they…" (ytc_UgymGo2Ot…)
- "What if you split ai up to optimise for 1 thing at a time, so the first for exam…" (ytc_Ugzik2k3T…)
- "I said god bless America and my ai gave me a whole ass American speech…" (ytc_UgxFmFmEd…)
- "Won’t AI gonna hit economy negatively impacting loss of business due to unemploy…" (ytc_Ugx_Ym_Fa…)
Comment
On the topic of AI, if you’re plugging in complex problems into AI for it to do all the work does that regress human intelligence? I feel like it’s far different than let’s say a person using a calculator. On the other hand you have AI, it does all the work for you and shows you how it’s done. Just curious on everyone’s opinion.
youtube · AI Responsibility · 2023-01-04T19:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyB8O4r26BKkVhgSAd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwMt4VDZrirQpG9_xp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy0Noc4JpdmhDDpcC54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxxyzhSoOKzxtIQ0S94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxmyg2g2pVPmRxKUgp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzOMAAh7tVHlNNaUIt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy57WVKesUo2lZb95Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw2DW40OCu-jdFnBjR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzlRY6_3Sg9FQQ0v9t4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxnfCI3dhMDOy8Uj9p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
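A minimal sketch of how a raw batch response like the one above might be parsed into the per-comment lookup behind the "Look up by comment ID" view. This is an illustration, not the app's actual code: `parse_coding_response` is a hypothetical helper, and the required-key set is inferred from the fields visible in the JSON (the real codebook may define more).

```python
import json

# Fields observed in the raw response above (assumption: these are the
# full required schema for one coded record).
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM batch response into a dict keyed by comment ID.

    Raises ValueError when the JSON is malformed or a record is missing a
    field, so an incomplete batch can be flagged for re-coding instead of
    silently entering the results table.
    """
    records = json.loads(raw)  # may raise json.JSONDecodeError (a ValueError)
    coded = {}
    for rec in records:
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id', '?')} missing {missing}")
        # Store everything except the ID itself as the coding result.
        coded[rec["id"]] = {k: rec[k] for k in REQUIRED_KEYS if k != "id"}
    return coded

raw = ('[{"id":"ytc_example","responsibility":"none","reasoning":"unclear",'
       '"policy":"unclear","emotion":"mixed"}]')
table = parse_coding_response(raw)
print(table["ytc_example"]["emotion"])  # → mixed
```

Keying by comment ID makes the lookup O(1), and failing loudly on a malformed record keeps partially coded batches out of the dimension table shown under "Coding Result".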