Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "The AI may not turn malicious on its own, but it would be more likely that AI ma…" (ytc_Ugw-KhU2c…)
- "Humans make thousands of mistakes. Sometimes unforgivable. Atleast AI is not cor…" (ytc_Ugz1_MSV7…)
- "Ai taking over all jobs because of one simple fact. You cannot have people on un…" (ytc_Ugwfkpl9S…)
- "I hatet how gringos.... say this long pronunciation at the end....16:29 "U.S eco…" (ytc_UgyNx-48F…)
- "if AI is so smart, why can't it solve issue like world problem with food distri…" (ytc_UgwxWzuaW…)
- "Obviously AI CEOS are really bad. they will be so good at their job that all com…" (rdc_jsz26ks)
- "AI is massively overhyped. Its is a mass delusion that is reinforced by companie…" (ytc_Ugzt7PR65…)
- "No the point of the laws is to be the 5 things a robot literally cannot disobey.…" (ytr_UggZ2V9k5…)
Comment

> Well that was stupid, chatGPT obviously just tried to reach you to a point of physically being able to achieve what you asked to. While you actually changed your question to "explain how its possible", however GPT didnt forget the main request. If youd just ask how's its possible without the request of doing it, it would do better and just say that if you actually move half way each time - it would take infinity, tho in reality you dont really move half way so its finite.

Source: youtube · Posted: 2025-05-19T21:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx_GCTgmqW8KJ5AEwx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx7Y3Wa15WdXXtcDGR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"frustration"},
{"id":"ytc_UgwsNb1SfkrTCvGVYMN4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwMUXiglhrPdfn2fVB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"frustration"},
{"id":"ytc_UgxAYQIbYbuZ8Be2NGl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxtmYXe41s_SB_gFBd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyZ5xt6U_8JXhPBz2R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx60hNmfciYoDSSHZR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxBt3YgnJGLbH8pwCd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx_xYcxPH51881ZAWt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
```
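The raw response is a JSON array with one object per comment, carrying the four coded dimensions (responsibility, reasoning, policy, emotion). The pipeline's actual parsing code is not shown here; the following is a minimal sketch, assuming a response shaped like the array above, of how such a batch could be parsed and a single comment's coding looked up by ID. The helper name `lookup_coding` and the two-row sample payload are illustrative, not part of the tool.

```python
import json

# Illustrative raw LLM batch response (same shape as the array shown above,
# trimmed to two rows for brevity).
raw_response = """
[
  {"id": "ytc_Ugx7Y3Wa15WdXXtcDGR4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "frustration"},
  {"id": "ytc_UgwsNb1SfkrTCvGVYMN4AaABAg", "responsibility": "none",
   "reasoning": "virtue", "policy": "none", "emotion": "approval"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw batch response and return the coding dict for one comment ID,
    or None if that ID is not in the batch."""
    codings = json.loads(raw)
    by_id = {row["id"]: row for row in codings}
    return by_id.get(comment_id)

coding = lookup_coding(raw_response, "ytc_Ugx7Y3Wa15WdXXtcDGR4AaABAg")
print(coding["emotion"])  # frustration
```

Indexing the array into a dict keyed by `id` makes repeated lookups O(1), which matters when the same batch is inspected for many comment IDs.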