• @[email protected]
      link
      fedilink
      English
      1142 months ago

      Yep, NOW it’s a problem, though! Because it’s someone else doing the same thing, someone who isn’t part of the human centipede starting at Trump’s colon.

    • @[email protected]
      link
      fedilink
      English
      202 months ago

      Chinese company:

      Truly, you have a dizzying intellect.

      Microsoft:

      AND I’M JUST GETTING STARTED! Where was I?

      Chinese company:

      Stealing data…

    • @[email protected]
      link
      fedilink
      English
      22 months ago

      How could it be better when they just stole everything? The fact that it’s better basically proves that it’s not stolen.

  • @[email protected]
    link
    fedilink
    English
    68
    edit-2
    2 months ago

    Stealing from thieves isn’t a crime.

    Especially not when China turns around and Robin Hoods it back to the world.

    Just saying.

    • sunzu2 · 30 points · 2 months ago

      China really did one on our oligarchs haha

      Beautiful

      These parasites expect me to side with them?

    • @[email protected]
      link
      fedilink
      English
      122 months ago

      Making R1 open source is such a big FU to all the grifters asking for billions for AI in the US. Especially funny because High-Flyer is a hedge fund firm themselves. The AI race should be determined by what you do with the technology, not by how much IP you hoovered up and now cry about others copying.

  • @[email protected]
    link
    fedilink
    English
    582 months ago

    “You can’t steal that public data! We stole it first!”

    And considering that’s exactly what Microsoft did to Apple with point and click, what irony!

  • @[email protected]
    link
    fedilink
    English
    38
    edit-2
    2 months ago

    Now that OpenAI accuses DeepSeek of stealing, I’m sure they’ll prove they actually have rights to the things that were stolen, right? XD

  • @[email protected]
    link
    fedilink
    English
    312 months ago

    Are they worried that DeepSeek took stuff written by others, mixed it up, and repackaged it as its own?

    Well, yeah, that’s all AI is: an expensive weighted pachinko machine that takes human-made content and remixes it.
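The “weighted pachinko machine” quip can be made concrete with a toy bigram sampler. This is a hypothetical sketch, not how a real LLM works: every word it emits is drawn, weighted by frequency, from the human-written text it was fed.

```python
import random

# Toy "weighted pachinko": a bigram chain that can only remix its input text.
corpus = "the cat sat on the mat and the cat ran".split()

# Count which word follows which; duplicates act as the "weights".
follows = {}
for a, b in zip(corpus, corpus[1:]):
    follows.setdefault(a, []).append(b)

random.seed(0)
word, out = "the", ["the"]
for _ in range(5):
    # random.choice over a list with repeats = frequency-weighted sampling
    word = random.choice(follows.get(word, corpus))
    out.append(word)
print(" ".join(out))  # every word comes straight from the human-made corpus
```

The output varies with the seed, but by construction it can never contain a word that was not in the source text, which is the commenter’s point.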

    • @[email protected]
      link
      fedilink
      English
      11
      edit-2
      2 months ago

      The question isn’t whether they’ve used the same information. It’s whether they’ve faked the process to achieve that 20x efficiency.

      Look at it like a dictionary. Writing one from scratch is a huge task, no matter how many other books exist. How do you even go about finding all of the words?

      But if other people have already written dictionaries, you can just use their word lists and go from there.

      It’s more efficient, but only because it’s a completely different task.
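The dictionary analogy maps loosely onto model distillation: a “student” fits itself to a “teacher’s” answers instead of to raw labeled data. A minimal toy sketch, using hypothetical linear models rather than anything resembling DeepSeek’s or OpenAI’s actual systems:

```python
def teacher(x):
    # Stand-in for an expensive, already-trained model (the finished dictionary).
    return 3.0 * x + 1.0

def train_student(samples, lr=0.01, epochs=2000):
    """Fit y = w*x + b to the teacher's answers via gradient descent."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x in samples:
            y = teacher(x)          # query the teacher, not raw labeled data
            err = (w * x + b) - y
            w -= lr * err * x
            b -= lr * err
    return w, b

w, b = train_student([0.0, 1.0, 2.0, 3.0])
print(round(w, 2), round(b, 2))  # prints 3.0 1.0 -- the student recovers the teacher
```

The student converges quickly because it skips the hard part, which is exactly the commenter’s claim: it is a different, easier task than building the teacher in the first place.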

      • @[email protected]
        link
        fedilink
        English
        02 months ago

        No AI company has ever made any of their own content to train their models; they took what others created, remixed it, and presented it as something new.

        This AI model did the same thing.

        AI lost its job to AI.

        • @[email protected]
          link
          fedilink
          English
          22 months ago

          Yes, but that doesn’t mean it is more efficient, which is what the whole thing is about.

          Let’s pretend we’re not talking about AI, but tuna fishing. OpenTuna is sending hundreds of ships to the ocean to go fishing. It’s extremely expensive, but it gets results.

          If another fish distributor shows up out of nowhere selling tuna for 1/10 the price, it would be amazing. But if you found out that they could sell them cheap because they were stealing the fish from OpenTuna warehouses, you wouldn’t argue that the secret to catching fish going forward is theft and stop building boats.

            • @[email protected]
              link
              fedilink
              English
              02 months ago

              So what happens when OpenTuna runs out of fish to steal and there are no more boats?

              Information doesn’t stop being created. AI models need to be constantly trained and updated with new information. One of the biggest issues with GPT-3 was the 2021 knowledge cutoff.

              Let’s pretend you’re building a legal analysis AI tool that scrapes the web for information on local, state, and federal law in the US. If your model was from January 2008 and was never updated, then gay marriage wouldn’t be legal in the US, the ACA wouldn’t exist, Super PACs would be illegal, the Consumer Financial Protection Bureau wouldn’t exist, zoning ordinances in pretty much every city would be out of date, and openly carrying a handgun in Texas would get you jail time.

              It would essentially be a useless tool, and copying that old training data wouldn’t make a better product no matter how cheap it was to do.

              • @[email protected]
                link
                fedilink
                English
                12 months ago

                Once tuna runs out, and we run out of boats?

                Maybe we then stop destroying the tuna population?

                Or, to bring this back to point: the environment will be better off once the AI bubble collapses.

  • @[email protected]
    link
    fedilink
    English
    252 months ago

    Lol, it’s like fucking Lavrov from fucking Russia screaming “this is against international law” when Europe froze their assets.

    • sunzu2 · 17 points · 2 months ago

      Bro… US reaction here is so pathetic…

      The behavior is indicative of a bigger issue. They really do think only they are allowed to cheat and steal to win lol

  • NutWrench · 22 points · 2 months ago

    When a writer copies someone else’s work without citation or compensation, it’s called “plagiarism.” But when an AI does it, it’s called “LLM training.”

    • @[email protected]
      link
      fedilink
      English
      22 months ago

      When a reader reads someone else’s work, that’s called “reading.” But when an AI does it, it’s called “training.”