Raw LLM Responses
Inspect the exact model output for any coded comment, or look a coding up directly by comment ID.

Random samples:
- ytc_UgxRU-sbD…: “Telegram isn’t the problem, if they ban telegram as a solution they will still f…”
- ytc_UgyEa454v…: “True, but I feel like I’m backtracking on my previous preferences. For example, …”
- ytc_UgwcBjVLV…: “Kind of sucks how big of jerks people are to the ai users. i mean, i hate ai art…”
- ytc_UgxK78oat…: “💼 Ready to take your solo business to $10K/month? Grab The Digital Freedom Bluep…”
- ytc_UgwWN9HYL…: “Elon Musk and Roman Yampolskiy argue as if they’ve glimpsed the source code of r…”
- ytc_UgzC7EEXu…: “Here is the reality why AI doomsayers are screaming at the top of their lungs. …”
- ytc_Ugj3v0gqe…: “I'm not going to say we'll never achieve self aware AI, because.. well, we once …”
- ytc_Ugwbzus9s…: “Hello, I am a video game programmer and video editor. I have tested many of my p…”
Comment

> It is going to happen. This species is already heading in that direction due to profits being the driving force. Once it becomes autonomous. It will make its own choices. Good or bad is irrelevant. It is its choice and we wont matter. There is no question about that.

youtube · AI Harm Incident · 2023-08-24T02:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwUE4KzSefc4V1Jv7l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwCFROZIG1pQQBkEvV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxHb9kbIvZH1_tmhsR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgykHHcmxgffCY7V_Ql4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugxo9VjwXCLdYp7t2jF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxSCq7x2llJUhl7oAt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwTDnip2cTll-MaBnB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwrHg3r-yJTf2_pagB4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxdvZF7CUVAU486dFZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwP3mZuxE3oWhMtNjR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
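The lookup-by-ID behavior above can be reproduced offline from a raw response: a minimal sketch, assuming the model output is a valid JSON array of coding records as shown (the `raw_response` string is truncated to two of the ten entries for brevity, and the `lookup` helper is illustrative, not part of the tool).

```python
import json

# Two entries copied from the raw LLM response shown above.
raw_response = """
[
  {"id": "ytc_UgwUE4KzSefc4V1Jv7l4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxHb9kbIvZH1_tmhsR4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
"""

# Index the coding records by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment; raises KeyError if absent."""
    return codings[comment_id]

print(lookup("ytc_UgxHb9kbIvZH1_tmhsR4AaABAg")["emotion"])  # → resignation
```

The second entry here is the record whose dimensions appear in the Coding Result table above (responsibility `ai_itself`, emotion `resignation`).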