Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "What I got for this don't tell AI yout personal info everything else is alright😊…" (ytc_Ugw1PtVaZ…)
- "What if the AI set target to certain people because those people are the one tha…" (ytc_UgwQwX48u…)
- "The term AI artist is an insult to every person who has ever lifted a pen.…" (ytc_Ugy7eZsha…)
- "If I ever have a table at a convention I WILL BE ACTIVELY DRAWING. They will be …" (ytc_Ugzipvnns…)
- "AI writes a nullref fault that bricks 80% of computers around the globe. Oh wait…" (ytc_Ugy2Gzppo…)
- "Im so sick of this Hinton jerk telling people to become plumbers. Why don’t YOU …" (ytc_UgxY3w87R…)
- "ai is gonna destroy the professional srt environnement, but we will still make a…" (ytc_Ugz7UK9-k…)
- "Someday parent's will have AI bot baby kids, no more real kids, and best part th…" (ytc_UgzfUampe…)
Comment
Look, I am sure professor Korinek is much smarter than me. I am slightly skeptical though, of people who mainly spend their time in academia. I work in one of the largest corporations in America (telecom, take a guess). Although there is some progress, this company runs on legacy hardware and software. Even if we have AGI in 2-5 years, and I think we will, it will take decades for the physical world to catch up. With the exception of industries that don't have a large physical component, like finance, replacing and creating the physical infrastructure required for AGI will take much longer. Additionally, AI needs to be monitored, jobs will change, and we need to change and adapt to it.
youtube · AI Jobs · 2025-06-13T15:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzQ1n5_87I3ZaPIY7d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy-d40ep_9vIVjowF14AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx3peCSCZ0E5SrhaFh4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyaQfw3QDQp8ysyji54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwJHxUxNeJjz5wTmgd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyXY0CJUAi0Qcz2JWl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxEDjzLQD2mPhmQk-B4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyLsSzn-XQGxzEN7r54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgznJvAVzcjjt9RygEp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzoqGT0nswwOItF62F4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
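The "look up by comment ID" view above can be reproduced offline with a short script. This is a minimal sketch, assuming the raw model response is always a well-formed JSON array like the one shown; in practice LLM output may carry code fences or trailing text that must be stripped before parsing. The `raw_response` string and the `index_by_id` helper are illustrative, not part of the tool itself.

```python
import json

# Illustrative excerpt of a raw LLM response: a JSON array of per-comment codes.
raw_response = """
[
  {"id": "ytc_UgznJvAVzcjjt9RygEp4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgzoqGT0nswwOItF62F4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]
"""

# Dimensions observed in the output above; the full codebook may define more.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw: str) -> dict:
    """Parse the model output and index each coded comment by its ID."""
    return {row["id"]: row for row in json.loads(raw)}

codes = index_by_id(raw_response)
row = codes["ytc_UgznJvAVzcjjt9RygEp4AaABAg"]
print({d: row[d] for d in DIMENSIONS})
# → {'responsibility': 'developer', 'reasoning': 'deontological', 'policy': 'ban', 'emotion': 'outrage'}
```

A dict keyed by comment ID makes each look-up O(1), which matters when a batch response covers thousands of coded comments.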