Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Lol the robot wants to go to school to learn why does it need to learn if it can…" (ytc_UgiDZ3cSr…)
- "This video fails to reconcile with the cost of AI: it takes massive data banks …" (ytc_Ugyttjx_E…)
- "I challenge the notion, that rights are formed around human needs. There is no r…" (ytc_UgzWIJykc…)
- "Im an artist, Im using AI for certain things, and honestly fucking good. I'm all…" (rdc_jwx96r7)
- "How come AI bots are either cute (albeit demonic) girls or faceless android type…" (ytc_UgyR64zmB…)
- "Ai will never admit when it is unsure or confused. Granted most humans won't ei…" (ytc_Ugznxqv7m…)
- "Thank you for your observation! It’s crucial to understand that algorithms often…" (ytr_Ugzud7eaK…)
- "That said Tesla has more accidents than other autonomous cars and yes there are …" (ytr_UgxG4HTpg…)
Comment
The problem with regulation is not only that it slows down profits (slowing down profits means going against peoples will since people will not buy things they dont want or agree with -- addiction being an exception)
The problem with regulation is that it gives government power to decide how to regulate the market. This incentivizes corruption and lobbying and gives foot in a door for companies to decide how to regulate the competition out.
Who decides what is good for society in a market that is regulated. The regulators and who regulates the regulators? No one.
Very specific regulation might work for AI but this isnt an easy problem to solve even if we had consensus that regulation is the way to go.
youtube · AI Governance · 2025-06-18T23:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx8BKyFly3QZlYOybN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz-nXsMIMiN9rkF25t4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwMzDDI9aeyM3P4WEt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzA0ysf0mnTRSUStnF4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzE6FkeZ3RLzkMj-hx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwzFTOw7_E8HkGM1H54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxkGHIiUUBRzcxV9bJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwAo8WZossLseHzNWx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxfzKvdmx3JXg-Kf5d4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz63yaiJ31uwfEwvsN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
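A response like the array above can be parsed and sanity-checked before the codes are stored. This is a hedged sketch: the allowed values below are only those observed in the responses on this page, not necessarily the full codebook, and `validate_rows` is a hypothetical helper.

```python
import json

# Value sets inferred from the raw responses shown above; the actual
# codebook may define additional categories.
ALLOWED = {
    "responsibility": {"government", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"fear", "resignation", "approval", "outrage", "mixed", "indifference"},
}

def validate_rows(raw: str) -> list[dict]:
    """Parse the model's JSON array and keep only rows whose codes are known."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in values for dim, values in ALLOWED.items())
    ]

sample = ('[{"id":"ytc_x","responsibility":"government",'
          '"reasoning":"deontological","policy":"none","emotion":"fear"}]')
print(len(validate_rows(sample)))  # 1
```

Rejecting rather than silently accepting unknown codes makes it easy to spot when the model drifts from the coding schema.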