Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Truck driver here fully support the autonomous trucks you have to learn to grow …" (ytc_UgwVsHiu5…)
- "the leap in energy and resource savin in the generation of a computing platform …" (ytc_UgxNJTo8i…)
- "We already know what we're going to do. Hopefully everybody remembers 20/20 th…" (ytc_UgyeUVPBM…)
- "@romysayah1473 That's entirely the problem. All the AI ethics policies wont matt…" (ytr_UgwzYTzlG…)
- "THERE ARE MANY BENEFITS TO AI. FOOLISH PEOPLE ENGAGE IN ALARMISM AND FEARMONGERI…" (ytc_UgxrBaXyV…)
- "No way that will happen. Tech billionaires own the government. They bought the …" (ytr_Ugx4XfNSF…)
- "I think using ai to make art is like telling an artist to create sth for u and g…" (ytc_UgykXRXmb…)
- "It’s actually confusing for me that anyone would willingly use ai to create some…" (ytc_Ugx_6Jsc2…)
Comment
The future is a mixbag. One day cancer will be completely cured, but at the same time, AI's technology will be used to abuse people's rights in an authoritarian society or to kill humans very effectively in a conventional warfare. Or a renegade AI computer can launch nukes by itself? It's a scarry scenario to even think about it. Mary Shelly's "Frankenstein" and The movies "T2" and "I, Robot" aren't so farfetched after all.
youtube
2025-01-06T07:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxdbgTWrH8RMKq3gO14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxEDTGzoHszlLO4q2l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzfSEp6XEO49JI_J5x4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyQNzXd2pk0QjsEhpx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwqTAh2jazQQpIeCU14AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxdc3hZFcGbRyKkKRx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzlbrhbAvYs8W3P_pZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy_JtXqJlIGcBMbRAZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwydQhCmkN9-gz6fi94AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwOhDeTdwd3ybcdhQB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
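The raw response above is a JSON array with one record per comment: an `id` plus the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be parsed and sanity-checked is shown below. The allowed-value sets are only those observed in this sample, not the authoritative codebook, and the function name is hypothetical.

```python
import json

# Values observed in the sample response above; the real codebook
# may allow more (assumption, not the authoritative schema).
OBSERVED_VALUES = {
    "responsibility": {"none", "developer", "user", "company",
                       "distributed", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and check that every record
    carries a comment id and one known value per dimension."""
    records = json.loads(raw)
    for rec in records:
        if not rec.get("id"):
            raise ValueError("record missing comment id")
        for dim, allowed in OBSERVED_VALUES.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UgwOhDeTdwd3ybcdhQB4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"fear"}]')
coded = validate_coding(raw)
print(coded[0]["emotion"])  # fear
```

A check like this catches the common failure modes of JSON-coded LLM output (malformed JSON, missing ids, out-of-vocabulary labels) before the records are written back into the coding table.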