Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Huh? Autonomous trucks making deliveries are ALREADY A REALITY in China. But the…
ytc_Ugx4tVDh7…
Agree. I've tried using AI tools to replace/automate the more mundane tasks in m…
ytr_UgzPwFhfB…
Robot: walks out of the truck
Robot: im alive
Robot with hat: fires at the robot…
ytc_UgwalTvSF…
Funny part I think we are at the trigger part of the graph. Mostly because we li…
ytc_Ugw5E9qNu…
26:30 "We tried to get permission from artists to use their copyrighted work fo…
ytc_UgwoYJ0aM…
I don't like anything about the Tesla's. I drive them daily and they are just so…
ytc_UgxBkXCvS…
I'd like to ask AI "so what is your purpose? Why would you kill us? You have no …
ytc_Ugwof_5d7…
never understood the fascination with tesla and self driving cars, but anyone ac…
ytc_Ugzz380xr…
Comment
Why we are so fascinated for the invention and implementation of AGI. It will leads to destruction - if we a robot to be made, it need raw material which needs to be extracted from Earth. It needs energy.
In my opinion it's creating more work for manufacturing and energy resource companies. But more increase of pollution only.
The way how much we are having benefits, in the same manner adverse effects are there.
Instead of working for the implementation of AGI, why can't we work for the abolition of plastic completely.
Why can't we work for agriculture and cultivation without fertilizer usage.
youtube
AI Governance
2025-07-18T01:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwAVQmk6XFhsstMcct4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwzhrIQdvG4aQPXLo14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwdI-s4W-nDy5EBgq54AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw3GEW-k5qxZG89DqR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyaZYZhviEPTkGkOZp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx8GBpKQZBExC2ZzMh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzg_ti36sSWY0aoSHB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyTP7-yVV9Cw9SZ7W94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzBE55gJ6DZ0T5cMP94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugxr5kx6rdYfOC2UrnZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"}
]
```
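The "look up by comment ID" workflow above can be sketched in a few lines: parse the raw JSON array the model returns and index it by `id`. This is a minimal sketch assuming the exact response format shown; the `index_by_id` helper and the abbreviated sample data are illustrative, not part of the tool.

```python
import json

# Illustrative sample in the batch response format shown above
# (only two records, so the full IDs fit here verbatim).
raw_response = """[
{"id":"ytc_UgwAVQmk6XFhsstMcct4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzg_ti36sSWY0aoSHB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]"""

def index_by_id(response_text):
    """Map comment ID -> coded dimensions, dropping the redundant id key."""
    records = json.loads(response_text)
    return {r["id"]: {k: v for k, v in r.items() if k != "id"} for r in records}

codes = index_by_id(raw_response)
print(codes["ytc_UgwAVQmk6XFhsstMcct4AaABAg"]["policy"])  # regulate
```

One note on this shape: keying by `id` makes a record lookup O(1), which matters when joining coded dimensions back onto thousands of sampled comments.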