Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “I own an exterior cleaning company. AI won’t take my job. But it might take all …” (rdc_nxpxowl)
- “Tracing AI and stealing it back is a great idea! Just admit to it, dont hide it!…” (ytc_UgxFNTCuQ…)
- “So what happens if a robot bound by Asimov's Laws builds other robots without su…” (ytr_Ugxvr9sin…)
- “If an "sonscious AI" has sensors it will be able to feel pain and pleasure. Low …” (ytc_Ugwq09tVr…)
- “I have ridden about 500 miles in Waymo across the Phoenix area. Generally speaki…” (ytc_UgyOxNV-2…)
- “A big 'problem' with AI is that it tries to 'predict' answers/solutions based on…” (ytc_Ugzn9wsop…)
- “They say they don’t have facial recognition however the flock terms and conditio…” (ytc_UgxBzrpNL…)
- “Every single time I hear of these scenarios, where AI and other new technologies…” (ytc_UgwRSDYs8…)
Comment
Hello, where is the 3 laws of robotics?
We need something in place if they are willing to end people lives to keep itself alive or a preserve itself so it can keep on doing its job/data.
Why are they racing towards AGI then ASI? If we can't agree on normal things like being able to respect each other countries.. and boarders.
So, will it align with us.. whom will have the keys to the kingdom.. Gosh I am typing and realise these companies don't care...they just want money and power... Sadly, the AI will want it all and won't want to share... even if we had enough resources...to share with them..
youtube · AI Harm Incident · 2025-07-24T09:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
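A coded record like the one above can be sanity-checked against the codebook before it is stored. This is a minimal sketch; the allowed value sets below are assumptions inferred only from the codes visible on this page, and the real codebook may define more categories.

```python
# Hypothetical allowed value sets per coding dimension, inferred from the
# codes visible on this page (an assumption, not the official codebook).
SCHEMA = {
    "responsibility": {"company", "developer", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation"},
}

def validate(record: dict) -> list:
    """Return a list of problems with one coded record (empty if clean)."""
    problems = []
    for dim, allowed in SCHEMA.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The coding result shown in the table above passes cleanly.
coded = {"responsibility": "company", "reasoning": "deontological",
         "policy": "regulate", "emotion": "fear"}
print(validate(coded))  # → []
```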
Raw LLM Response
[
{"id":"ytc_Ugzqsx83skliS7pJ8iZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwCTRIlx6FsRPbfegV4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxHOwmIFg2kZnPJXUF4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwXxPClNEIaI0ggpmN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugy9pk1-lt1y_v7g4Mx4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzZ8Lhm23yXQFaKz1N4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzVUIrasnv4RcL81ud4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxyTxwSdG6aeuxII3J4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz3G1woQ9FZ2ucPJZJ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxqQGxI0LLp87IPxn14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
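The raw response above is a JSON array with one object per comment, each carrying the comment ID plus one value per coding dimension. A sketch of how such a batch can be parsed and indexed for the "look up by comment ID" workflow (the two-record sample string here is an assumed stand-in for the full model output):

```python
import json

# An assumed short sample of a raw batch response; in the tool this would
# be the full model output shown above.
raw_response = """
[
  {"id": "ytc_Ugzqsx83skliS7pJ8iZ4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwCTRIlx6FsRPbfegV4AaABAg", "responsibility": "none",
   "reasoning": "virtue", "policy": "none", "emotion": "approval"}
]
"""

records = json.loads(raw_response)

# Index by comment ID so any coded comment can be inspected directly.
by_id = {rec["id"]: rec for rec in records}

coding = by_id["ytc_Ugzqsx83skliS7pJ8iZ4AaABAg"]
print(coding["policy"])   # → regulate
print(coding["emotion"])  # → fear
```

Keeping the raw string alongside the parsed index is what lets the tool show the exact model output for any coded comment, as this page does.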