Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "It's fascinating to think about how far technology has come in just a couple of …" (ytr_UgyqY92s0…)
- "@lamo1919 in 1901, Wilbur Wright himself said that humans would not fly within a…" (ytr_Ugx67zHWU…)
- "The real alignment problem isn't AI. It's **us**. **Greed**, **ego**, and the race to be #…" (ytc_UgyltU64F…)
- "The heavy handed AI editorialising is going to be a real problem. they will all …" (ytc_Ugx-hZdCu…)
- "None of this will happen. If we can't even work collectively to regulate the tec…" (rdc_kitvnoa)
- "I just heard today of an AI robotics place in Japan, where there were four incom…" (ytc_UgwJggzF1…)
- "As all advances in technology, AI too will turn out to be a productivity improve…" (ytc_Ugw46TIOe…)
- "Fighting AI is like those who fought moving from horses to railroads to cars, fo…" (ytc_UgwnB06Mc…)
Comment
As a software dev myself, I'm currently not too impressed with the quality of work that AI does. I'm impressed, but not too impressed. I can feed many simple JIRA tickets to Claude and it will fix them quite OK, but with any complex bugs that require actual intelligence, it just gets lost trying random stuff.
(My company is pushing us to use AI as much as possible, so I always try to fix bugs with AI first, but when it goes nowhere after 2-3 days of the -max agent working, I take matters into my own hands.)
So I'm not too worried about SW devs losing all the jobs. (Still, I got a truck driver license just in case :-))
youtube
AI Jobs
2026-03-16T09:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgyDP585Cb8_hFCccWV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyQIDkqGxq93KaOxSh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyK_pulziEPa9jQaKJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxbCDLLXf29IZtm0WF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw2q7Y6-a12zinLCZd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy5DhPyl6BPUW6VxsJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyJT9gsVy10VB33rAh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwCudajEBbW45go9dl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwqaRpRTa-Ld2fUR-d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw9D9aA8BTMJ-VQiN94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
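The raw response above is a JSON array with one object per comment: an `"id"` plus the four coded dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response might be indexed to support the "Look up by comment ID" view — the function name `index_by_comment_id` is hypothetical, not part of the tool:

```python
import json

def index_by_comment_id(raw_response: str) -> dict:
    """Parse a raw LLM coding response and map comment ID -> coded dimensions."""
    rows = json.loads(raw_response)
    # Drop the "id" key from each row so the value holds only the codes.
    return {row["id"]: {k: v for k, v in row.items() if k != "id"}
            for row in rows}

# Two entries copied verbatim from the raw response above.
raw = """[
  {"id":"ytc_UgyDP585Cb8_hFCccWV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy5DhPyl6BPUW6VxsJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"outrage"}
]"""

codes = index_by_comment_id(raw)
print(codes["ytc_Ugy5DhPyl6BPUW6VxsJ4AaABAg"]["emotion"])  # outrage
```

In practice a real LLM response may carry malformed JSON or duplicate IDs, so a production lookup would wrap `json.loads` in error handling; this sketch assumes a clean response.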