People who make low-effort AI-bashing commentary videos, but it’s just them reading a script over some stock footage. You shouldn’t be making worse content than an AI, my guy.

    • Hackworth@lemmy.world

      Algorithms that value engagement over quality are the bigger problem. Stock footage and generative AI are both fine and basically unrelated to this problem.

      • kibiz0r@midwest.social

        If you dislike vapid slop that’s designed to maximize adherence to opaque and fickle metrics, you might wanna reconsider whether gen AI is fine and unrelated to the problem.

        We’re seeing the genesis of the information equivalent of Kessler Syndrome here. Toxic promotion algorithms are quaint, comparatively.

        Edit: Fwiw, I did not downvote you. Have a good day, friend.

        • Hackworth@lemmy.world

          So I’ve been producing video professionally for ~25 years, and judicious use of gen AI lets me do some things I wouldn’t have the time/resources to do otherwise. As a simple example, Premiere’s Generative Extend will add a few seconds to the end of (basically) any video clip (basically) seamlessly. Often that’s all the pad I need to improve a cut. The alternatives (re-shoots) are expensive, time-consuming, and only approved on an as-needed basis.

          Many of the same concerns about the market being flooded with low-quality content were raised with the advent of video, again with digital video, and again with HD video. The barrier to entry for film is high; for video, it’s virtually non-existent. But I don’t think anyone would claim today that video was a bad idea. AI is in some ways the same kind of democratization of production technology.

          I’d be remiss if I didn’t address the ways in which it’s not similar. We can set up a completely automated workflow right now that will quickly generate YouTube “content” and probably make a profit. We could do this before gen AI, but not with such hallucinatory gusto. YouTube is currently being flooded with this crap. But just like people left Twitter (or reddit) when it became overrun by bots, people aren’t going to stick around for your platform full of AI content (at least not until it’s much better).

          The IP side of this is mostly funny to me. They’re already talking about a “post-plagiarism” world in academia. I don’t see how copyright survives gen AI at all long-term, frankly. As an artist who saw his first distributed feature film show up on The Pirate Bay the same day it was released, it just doesn’t bother me. I’ve only ever really gotten paid to do specific work for a client. I don’t expect to get paid for things I make to express myself artistically.

          But I hate that I’m shackled to Adobe for a variety of other reasons, and if someone has a good suggestion for an open source alternative to After Effects, I’m all ears.

          Edit: No worries. TL;DR: YouTube will hopefully be forced to alter its toxic algorithms to at least better filter out purely automated schlock.

          • kibiz0r@midwest.social

            I can understand that take, but to me the more relevant comparison is the fossil fuel boom.

            Like the advent of more powerful creative tools (camera, printing press, etc.), fossil fuels allowed us to do what we were already doing but faster.

            Unlike a camera, though… Coal, oil, and gen AI all have to pull raw material from somewhere in order to operate, and produce undesired byproducts as a result of their operation.

            In the early days of fossil fuels, it must have been impossible to even conceive that there might be limits to how much we could safely extract raw materials or dump hazardous residual crud.

            From one person’s perspective, the world seems so impossibly large. But it turns out, there are limits, and we were already well on our way to exceeding them by the time we realized our predicament.

            I think we’re sprinting towards discovering similar constraints for our information systems.

            It won’t be exactly the same, and much like climate change I don’t think there will be a specific minute of a specific day where everything turns to shit.

            But I think there are instructive similarities:

            • The most harmful kinds of gen AI, as with fossil fuels, will have the highest ROI.
            • There will be safe, responsible ways to use it, but it will be difficult to regulate from the top down and full of perverse incentives to cheat from the bottom up.
            • It will probably continue to accelerate even as the problems become more noticeable and disruptive.
            • It will be next to impossible to undo the damage at a significant scale.

            Edit: Yeah, I’m also not too sympathetic to the death of copyright. It’s long overdue. The original conception of copyright was botched from the start – more about capital and censorship than about anything for the good of all humanity. But still, there are two good things that I see about copyright that I worry about losing in a “post-plagiarism future”:

            1. Accurate attribution

            I like knowing where an idea came from, whether it’s an artistic concept or formal information. It’s nice to know where I should go to continue the conversation and see more thoughts that came from a similar source.

            2. Faithful replication

            I think it’s important that ideas keep their same general shape when they’re copied around, at least when the copier isn’t deliberately intending to modify them. Things like satire can easily be mistaken for earnest content if you tilt your head 5 degrees to the side. I don’t want the discrete bits and bobs of that kind of content to get sucked up and recycled to make arguments for the very thing the original author was arguing against.

            Edit 2: I remembered there’s a Google Tech Talk from ages ago, where the speaker argues copyright is unnecessary to preserve these two attributes in the digital age, because search engines are so reliable that you’ll never be confused about the origin of a work, and computers only make exact copies of things so there’s no risk of degrading a work over time. I think those arguments have not aged well, lol.

            • Hackworth@lemmy.world

              Oh, I was really just focusing on video production and entertainment in the next few years. If we’re talking AI’s influence on the information landscape as a whole and humanity in general: I think we’ve discovered how to make a spark, maybe how to gather kindling. We’ll have this fire thing figured out soon, and then who knows what happens. I have no doubt that too much is going to get burned as we learn the dangers and limits of the flame. But civilization awaits us if we survive. I dunno what that means for this technology. Like really, I can imagine so many seemingly equally plausible 2050s that I can’t plant a flag in one. From utopian to dystopian to downright mediocre, I wouldn’t know where to place a bet.

    • Duamerthrax@lemmy.world

      Low-effort content and padding the presentation out to maximize ads. I used to automatically discard videos that ran around 11 minutes, since that length signaled they were optimized for ads, before YouTube changed their policy at some point.