Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
If there are masses (1/3 of population) of people permanently put out of work b…
ytc_UgzlYdKiG…
Lots of worries and little solutions.
"AI will take your job"
"You should work…
ytc_Ugz2kG4oQ…
not in the way you think if done ethically. freedom is exponentially greater tha…
ytc_UgwAnR_9b…
As long as ai art is distinguishable from normal art and as long as it's used wi…
ytc_UgwdR6xBV…
Imagine being in a war on the battlefield, and in the middle of the firefight, a…
ytc_UgxSqbvPX…
okay I have watched the full interview twice now and honestly, I just don't see …
ytc_Ugy_UfQv7…
That's conspiranoic. It would be better to say that A.I. is the most expensive …
ytc_UgyBGjyRX…
AI video about rats in dollar general… DOLLAR GENERAL was spelled wrong 🥴 DOLLR …
ytc_UgwETkBzm…
Comment
everyone seems to talk about AI will benefit the world but right now its making people lose there jobs, we have big corporates mass firing employees already and machines replacing human that prepare food and cook it, Uber drivers that be replaced by cars that can already drive themselves.. what would stop big corporations from firing truck drivers to be replaced by any vehicle that can drive itself? pilots, boat navigators ect... the real question is what are we doing RIGHT NOW to prepare for that to come when people are left with no work? these big corporates don't seem to talk about that and have no plans to give the tools to the mass layoffs with other jobs. so far big corporations have shown they just want to fill up there pockets even more
youtube
2026-02-01T20:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxaDg2tzzOXmwzSvo54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw0_sbTKSfcg97NQC54AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwBeL0_yT0QJ_63kQZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgysZGbi2A5t5xht9ul4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugw14aRZjYbbTA-Bx6Z4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzEYtCoERq1mnmReUN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugyf8tXAre3ythYOfsF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwsvLGTBEw6TSRuPUx4AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzKk2UiBNDrmt1u1LV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugxg9pkehSZqUR-zriB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"}
]
```