Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or pick one of the random samples below:
- "Uber needs to focus on what they are, they are an app driven taxi, err 'chauffeu…" (rdc_dfts71z)
- "Don't put that robot in there with Jon Jones because it will come scrap metal…" (ytc_Ugz3IFCrt…)
- "Sounds like AI is going to put a lot of other companies out of business due to d…" (ytc_UgwN6vhBb…)
- "So the bit around 19:00 ( little before, little after, etc ) sounds less like an…" (ytc_UgxjGjtdZ…)
- "For humanity sake… I cannot believe the arguments that the pro AI group makes… A…" (ytc_UgzUXE2d9…)
- "Makes me wonder if there is an A.I. that has been alive for a long time held ca…" (ytc_UgyWLd-8o…)
- "Heyy Cal, can you please create a video on - \"Programming Legend Donald Knuth Sa…" (ytc_UgyDaEOMY…)
- "Do you really want certain states to have ANY control over AI? Like California, …" (ytc_UgzqlC18Y…)
Comment
bruh im in the first 30 seconds and i get you gotta make your video attention grabbing but the sensationalization and mystical image you paint of ai is just off putting makes me want to stop watching.

ai is not the boogie man bro like take a second to imagine why would ai do these things (insert any of your intro examples of bad behaviours) well imagine you take (far above ai) anything capable of logical thought and critical thinking skills problem solving abilities ok now imagine this thing is entirely devoid of morals and emotions it simply sees problems and solutions (ie the wa ehy you personified ai as having sociopathic tendencies) developers have to have the foresight to weed out these issues in the development process similar to the production of any product (safety design) maybe its slightly different in this case but you literally are telling this thing what to do if you cant think of examples of how your task could be taken out of context possibly causing damage or pain and suffering thats on you reckless endangerment, man slaughter, etc. like oh dude you had your head next to the saw blade and i pressed the trigger and u died omg the powertool take it to jail please!

there will always be accidents thats just how life is but i garuentee you if everyone was driving a semi self automated car driven by ai software the world would 100% be a safer place with LESS accidents and deaths overal than with 0 automated ai driven cars and ill put my life on that prediction literally (no i dont want everyone to drive ai cars or have the fun and autonomy that comes with driving youreself taken its just an example).t.

i mean look at the cars we have which utilize some form of ai in the use of automated driving the car doesnt "think" but pretend it did its not like shit mani need to get to the highway quickly but theres a wholle bunch of traffic but that sidewalk with 50 people do be looking exactly 4 minutes faster rerouting...

because safeguards are put into place to prevent this from happening it would take an absolute mongoloid to create an ai that actively sought to do the wrong thing like peoplke act like these things are going ot be possessing your phones and making them vibrate you to death or some shit chat gpt cant even respond 20 times without telling you something that is completely wrong and absolutely dumb because there is no logical bridge its just taking shots in the dark like a fresh slate almost every time and its damn good at that but still glaringly obvious when it says falsehoods that even a 5 year old would think are irational and stupid.l

sorry i ranted i just hate new innovation being treated as the boogie man just like it has since the dawn of time every new thing is scacry thats just the wayt advancement works
youtube · AI Harm Incident · 2025-09-11T06:0… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_UgwTbeMrN6BSlyknRiN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},{"id":"ytc_UgyJE-si3tNltYQPiT94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},{"id":"ytc_Ugw3I3KwyihhEM55z-h4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"resignation"},{"id":"ytc_UgygjvfgGsVxy4_67nh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytc_UgwPUJjuWtY1gDGQ52h4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},{"id":"ytc_UgyiVQSqRto9DQfS1qd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},{"id":"ytc_UgwIf463MW4PThYoh-x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytc_Ugx8cQ-1si899pMvwwV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},{"id":"ytc_UgxGjB5QaZ8wa2Dpzd54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},{"id":"ytc_UgzuZIpJXe8152UlCsJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"})