Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
One day we’re gonna ask ai a question and it’s gonna tell us “google it yourself…
ytc_UgzvaOayy…
You know what? Damn it, I too will pledge 14billion to Nigeria. And I will give …
rdc_jzo1ile
Can't tell the difference if they cover the machine with some skin maybe in thei…
ytc_Ugz6AiwLK…
There's a great short story by Stanisław Lem written some 50 years ago about th…
ytc_UgwreLjhR…
I am not afraid (20 yoe working as a senior robotics engineer in perception), bu…
rdc_oae4pzl
I think robots are the first step to us increasing our life span substantially. …
ytc_UgweYwY6i…
Give a logical system illogical goals and a sense of self... but why?? We just n…
ytc_Ugx2ag7I3…
I think half the problem is that no one can give an assertive answer about what …
ytc_UgyRtEEB8…
Comment
Task Manager, in itself, was some 22000 lines of code. I had copies of the Express Edition languages from 2008 on my 8 tablet. 22000 is a lot of code to manage, so you really had a clear concise plan from the start. There's room for AI in terms of both standards development and user experience. Unfortunately, there's no replacement for education and with such a large code base, it creates a barrier to entry. Microsoft notations are languages unto themselves, but they're written with the curious in mind. It takes effort to master some technical aspects of asynchronous execution. Anything worth doing requires effort.
youtube
AI Jobs
2024-01-28T14:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_UgwHj7RP7HmXuRnMT_B4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
 {"id":"ytc_UgybvUDkznorAncHlzt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
 {"id":"ytc_Ugz0aszq3FSH98pgG7p4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgyVDpkD5FY5daIcuON4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgyA3kT8GwdZjNxJfxN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugwns3qpydMV12r4Cj54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgzN3YuNW_k4YQ8OlZN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgzXCjE0wiMqblFgtJl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgznOhSlFDv4R2VSP0R4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
 {"id":"ytc_UgyYtIWyTQdRHgKFbc94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"}]
```
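A minimal sketch of how a raw response like the one above might be parsed and validated before its codes land in the result table. The allowed value sets are inferred only from the samples shown here (not an authoritative codebook), and the function name `parse_coding_response` is hypothetical.

```python
import json

# Allowed values per dimension -- inferred from the sample output above,
# an assumption rather than the pipeline's full codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer"},
    "reasoning": {"mixed", "consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability"},
    "emotion": {"approval", "fear", "resignation", "indifference", "outrage"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and check each record's fields."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError("record missing comment id")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} value {rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UgwHj7RP7HmXuRnMT_B4AaABAg","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"approval"}]')
records = parse_coding_response(raw)
print(records[0]["emotion"])  # approval
```

Validating against a closed value set at parse time catches the common failure mode where the model invents an off-schema label, before it silently propagates into downstream counts.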