Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
3:10 That graphic is very misleading. While it is clear that the training of AI models can be very computationally intensive, you can't just claim that the training of one model leads to x amount of emissions. Models like the GPT models or DALL-E surely did lead to a lot of emissions, but there are several applications where the training of the AI produces orders of magnitude less CO2. The size of the model and the duration of the training matter! Also you have to consider that after the training, i.e. during usage, the production of emissions is way lower. Still, there can be a lot of training runs, because most of the time there are a lot of iterations of the AI models.
Source: youtube · Viral AI Reaction · 2024-10-26T15:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxg4j8RInMM3RCU6Ll4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyvaBdEgJD50rN0UgF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzxbirGWlDL2LG0wUp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy0BCgsc4jPpWPRC3x4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyyxHAOC9zu0GXATax4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw-lh8eplHS5sVLuKd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxmemNCwHRAbCnvpVF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzNKJT-PltS3_auM8B4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy5-WNSZyRvehNjOoN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxkDmABty6Y7s0AbPd4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"}
]
```
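A raw model response like the one above has to be parsed and checked before its codings are stored. Below is a minimal sketch of such a validation step in Python. The allowed vocabularies (`ALLOWED`) are inferred from the values visible on this page and are an assumption, not the project's authoritative codebook; likewise, the `ytc_`/`ytr_` ID prefixes are assumed from the IDs shown here.

```python
import json

# Allowed values per coding dimension — ASSUMED from the values visible
# in this page's output, not taken from an official codebook.
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"approval", "indifference", "mixed", "outrage", "resignation"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding records.

    A record is kept when its id carries an expected prefix (ytc_ for
    top-level comments, ytr_ for replies — assumed convention) and every
    coding dimension holds an in-vocabulary value.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        comment_id = rec.get("id", "")
        if not (comment_id.startswith("ytc_") or comment_id.startswith("ytr_")):
            continue  # malformed or missing comment ID
        if all(rec.get(dim) in vocab for dim, vocab in ALLOWED.items()):
            valid.append(rec)
    return valid
```

Applied to the response above, all ten records pass; a record with an out-of-vocabulary value (e.g. `"responsibility": "robot"`) or a hallucinated ID would be dropped rather than written to the results table.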