Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by comment ID, or browse the random samples below.
Random samples
- "Suing the mod for spreading false information, in this case stating that my art …" (`ytc_UgwBmqUTc…`)
- "It made me smile when I saw Jquery and 2015 as antiquated things in the Legacy s…" (`ytc_UgxBWTv1m…`)
- "Small business in a small town. My 10 employees will have jobs as long as they w…" (`ytc_Ugzx-BMBg…`)
- "I'm a disabled artist. For me personally doing art and making my money with that…" (`ytc_Ugyx0WMr9…`)
- "So what can we do to not get replaced by ai if I Am learning React Js Right Now?…" (`ytc_UgxKdoNDS…`)
- "Stop being scared of AI. Do you pee your bed every night thinking about AI? Th…" (`ytc_Ugym-RWtr…`)
- "The plot twist is that most companies building A.I have heavy leftist cultures t…" (`ytc_Ugzl719HG…`)
- "Wow. Maybe someday a robot of a dear someone who passed away can be created. Wha…" (`ytc_UgwXfmk9W…`)
Comment
@InfinitaCity I'm not sure if I agree, but if I did, would you not concede that it completely depends on the capabilities or risk of the technology? For example, the safety requirements for a kitchen fork should be low because its capability for harm is low, whereas the safety requirements and security classification for a gain of function lab should be extremely high due the capability for harm. Likewise, a technology that can help terrorists to create high level gain of function labs (as Anthropic spoke about) should also be heavily regulated
Source: youtube · Topic: AI Governance · Posted: 2023-08-20T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_UgxClpKCOca9l72V6wN4AaABAg.9pVxfb4Xki_9pX2CRyoLzz","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugxpmtj0TS1YrNnBVzp4AaABAg.9pVxZtplBtm9pW8vSBIlj-","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytr_UgzU4TtPOomA81vHOlZ4AaABAg.9pVwdDd9EtM9pXYDO0bwkj","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugxhapbt_9wMHv4IR0t4AaABAg.9t_q1vcLTxO9tdVLdtki49","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytr_Ugy6gczKkWt9D7andhx4AaABAg.9tT64Hf1EkB9tTfUBI5b7g","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytr_Ugy6gczKkWt9D7andhx4AaABAg.9tT64Hf1EkB9tXrFBpA8WG","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugy6gczKkWt9D7andhx4AaABAg.9tT64Hf1EkB9tYZwygn3Qp","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytr_UgxakbXidzyURU6t2Gh4AaABAg.AU4cyREleFbAU5KLeLQhZ8","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgwAqLk9Mv-hiFAAxGx4AaABAg.AU2qxnHl8g4AU39zLZ3EnC","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytr_UgwAqLk9Mv-hiFAAxGx4AaABAg.AU2qxnHl8g4AU5_LMJLQza","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
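A downstream consumer has to parse batches like the one above and reject malformed records before they reach storage. Below is a minimal sketch in Python; the label vocabularies are inferred from the sample output shown here (the real codebook is likely larger), and `parse_coding_response` is a hypothetical helper, not part of the actual tool:

```python
import json

# Allowed values inferred from the sample batch above.
# Assumption: the real codebook may define additional labels.
ALLOWED = {
    "responsibility": {"company", "developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "unclear"},
    "emotion": {"fear", "approval", "indifference", "mixed"},
}


def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding batch and index records by comment ID.

    Raises ValueError on missing IDs or out-of-vocabulary labels,
    so a bad batch is caught before it is written to the database.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing 'id': {rec!r}")
        for dim, vocab in ALLOWED.items():
            value = rec.get(dim)
            if value not in vocab:
                raise ValueError(f"{cid}: unexpected {dim!r} value {value!r}")
        # Keep only the four coded dimensions, keyed by comment ID.
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded
```

Indexing by ID also makes the "look up by comment ID" view above a single dictionary access; duplicate IDs in a batch would silently keep the last record, which may or may not be the desired behavior.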