Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "I would rather have the freedom to drive my own vehicle and take the chance of d…" (ytc_UgwYW9G7f…)
- "I'm finding them really useful as a productivity and information tool. They help…" (ytc_Ugx4cf1xL…)
- "That's a great suggestion! In the video, Sophia emphasizes her continuous learni…" (ytr_Ugwcp6OMG…)
- "Good man Blake! I know next to nothing about AI, but can the AI access its own s…" (ytc_UgwsJYDik…)
- "Everyday on the news we here something like taxes need to increase, the governme…" (ytc_UgxJ3Dkvn…)
- "1000% AGREE, as a consumer, as soon as I hear that Ai voice, (we all know the on…" (ytc_UgxUHKdb1…)
- "however, we do not know how conciousness is related to intelligence. Maybe they …" (ytc_UgzWV5x9r…)
- "I find it hard to understand how anyone can think this is AGI or even close. The…" (ytc_UgzVCB2pP…)
Comment

> so here's a fallacy: lots of people *had* to die before we made cars/planes/etc. safer -- thus, releasing dangerous products for public consumption and fixing 'bugs' as in response to deaths is a good strategy for this new dangerous product;; holy fuck, stop socializing the costs (unless you're going to socialize the profits) -- we don't need AI until it is unlikely to ruin our lives/kill people/cause death;; just gtafohffs

Source: youtube · AI Jobs · 2025-11-19T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugz7AqhqAZNCC4_v4KN4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwUZ67etWZHdUa1q1x4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwxk_7TUQkXU-17M3t4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwckXWnQogs_rJuza14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwYrylvqNDYRb4ZhUl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwxyaERSSVoRvngihx4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugwyk_U6H55zAPACFZl4AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugy_3rBbf-EVTlP5afx4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugx6G5QoUNQM-jYEyGR4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgyYbaCx_DJYvzdg3lt4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
```
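The raw response is a JSON array of per-comment coding rows, so looking a comment up by its ID amounts to parsing the array and indexing it. A minimal sketch of that lookup, assuming the response parses cleanly (the two rows below are an excerpt of the array above; `raw_response`, `rows`, and `by_id` are illustrative names, not part of the tool):

```python
import json

# Excerpt of the raw LLM response shown above (two of the ten rows).
raw_response = """
[
  {"id": "ytc_Ugz7AqhqAZNCC4_v4KN4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugy_3rBbf-EVTlP5afx4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]
"""

# Parse the array, then index the rows by comment ID for O(1) lookup.
rows = json.loads(raw_response)
by_id = {row["id"]: row for row in rows}

coding = by_id["ytc_Ugy_3rBbf-EVTlP5afx4AaABAg"]
print(coding["policy"], coding["emotion"])  # liability outrage
```

A real implementation would also need to handle malformed model output (e.g. wrap `json.loads` in a `try/except json.JSONDecodeError`) and IDs missing from the batch.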