this post was submitted on 03 Mar 2026
560 points (99.0% liked)

Technology

82745 readers
2821 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related news or articles.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, this includes using AI responses and summaries. To ask if your bot can be added please contact a mod.
  9. Check for duplicates before posting, duplicates may be removed
  10. Accounts 7 days and younger will have their posts automatically removed.

Approved Bots


founded 2 years ago
MODERATORS
top 50 comments
sorted by: hot top controversial new old
[–] deltaspawn0040@lemmy.zip 105 points 2 weeks ago (6 children)

Controversy... what controversy? It sounds more like blatant journalistic malpractice.

[–] Khanzarate@lemmy.world 28 points 2 weeks ago

A few years ago, blatant journalistic malpractice was a controversy.

[–] artyom@piefed.social 11 points 2 weeks ago (3 children)

When I suggested he be fired on another thread I received several responses saying "he made a mistake" and "he was sick", and many downvotes in return.

[–] XLE@piefed.social 10 points 2 weeks ago (5 children)

The comments here around this were so... off. I guess nothing was certain, but we were supposed to believe that the author was too sick to write an article, yet also writing an article and using an AI "tool" at the same time.

Hindsight is 20/20, but popular defenses at the time were

He wrote the article himself, he just got mixed up when experimenting with using an AI tool to help him extract quotes from a blog entry. (He is the head AI writer, so learning about these tools is his job.) It was nonetheless his failure to check the quotes he was copying from his note to make sure that he got them right… but an important bit of context is that he had COVID while doing all this.

I was the one who wrote that comment, and it was not an attempt to excuse all of his actions but a response to the following comment:

Someone deserves to be fired. Just imagine you're paying someone to do a job and they just 100% completely outsource it to a machine in 5 seconds and then go home.

Here is the full comment that I wrote, including the part you snipped off at the end:

He wrote the article himself, he just got mixed up when experimenting with using an AI tool to help him extract quotes from a blog entry. (He is the head AI writer, so learning about these tools is his job.) It was nonetheless his failure to check the quotes he was copying from his note to make sure that he got them right… but an important bit of context is that he had COVID while doing all this. Now, arguably he should have taken sick time off instead of trying to work through it (as he admits), but this would have cost him vacation time, and the fact that he even was forced into making this choice is a systemic problem that is not being sufficiently acknowledged.

[–] totally_human_emdash_user@piefed.blahaj.zone 7 points 2 weeks ago (4 children)

I did not downvote you—my instance does not allow or show downvotes, which is really nice!—but he was sick, and he did make a mistake, and him being fired does not make either of those things false.

Also, a ton of people were piling on him in that thread, so you had plenty of company in calling him to be fired.

[–] deltaspawn0040@lemmy.zip 6 points 2 weeks ago (1 children)

Amazing. Just great.

Imagine being confronted for lying and just going "hey, it was an accident, okay? I didn't MEAN to deceive people, I just used the machine known for deceiving people, willingly put my name on its deceptions, and it deceived people!" and having people defend you.

[–] totally_human_emdash_user@piefed.blahaj.zone 11 points 2 weeks ago (1 children)

Actually, he completely admitted to and took full responsibility for his mistake; at no point did he offer an excuse, only an explanation.

To the extent I was defending him, it was because people insisted on painting him in the worst possible light, and on misinterpreting his explanation as an excuse, not because I think that everything that he did was okay.

[–] deltaspawn0040@lemmy.zip 7 points 2 weeks ago* (last edited 2 weeks ago)

You do have a point, after reading the article. That's a bit embarrassing for me, honestly. Ragebait got me again, it seems...

[–] tidderuuf@lemmy.world 74 points 2 weeks ago (3 children)

I'm not taking all the credit, but I do hope those people who didn't believe me in the past will rightfully take this comment, print it, pull down their pants, and shove it up their ass.

It's time to hold journalism to a higher standard, and this idea that "well, they do alright" and "it was only once" is bullshit sliding into madness.

Just the facts, folks.

[–] just_another_person@lemmy.world 31 points 2 weeks ago* (last edited 2 weeks ago) (5 children)

The problem with your attitude towards this is that these companies are forcing "AI" down everyone's throat. It's a requirement now to churn out more bullshit than humanly possible.

This person was simply fired because they didn't catch the false information, and not because they used the tools forced upon them.

[–] mrmaplebar@fedia.io 63 points 2 weeks ago (1 children)

To be fair to Ars Technica, that doesn't sound like the case to me.

The "journalist" in question seems to be suggesting that this was their own bad judgment to use AI to "find relevant quotes" from the source material.

Having said that, there's also a senior editor on the byline who hasn't been held accountable for clearly failing to do their job, which, as I understand it, is to read, edit, and verify the contents of the article. So either way, Ars seems to have a quality problem, whether or not the use of AI was mandated.

[–] just_another_person@lemmy.world 30 points 2 weeks ago (3 children)

Ars is owned by Conde Nast, which has multiple whistleblowers saying AI is being forced on them. I think that's kind of relevant.

[–] protist@mander.xyz 12 points 2 weeks ago

Is there any evidence this is happening at Ars Technica? They're pretty transparent about their methods, and obviously tech-savvy. Just because it happened at Teen Vogue doesn't mean it's happening at Ars. Conde Nast publications seem to be run pretty independently. Take The New Yorker, their content remains amazing and seems fully independent.

[–] Railcar8095@lemmy.world 6 points 2 weeks ago

Most companies have AI forced on them, either directly or indirectly ("you need to double your output, AI can help..." kind of thing).

[–] MountingSuspicion@reddthat.com 11 points 2 weeks ago

I don't work at Ars, and maybe you know something I don't, but I have seen nothing to suggest that they're one of the companies doing that. They seem to be pretty open about not allowing AI in their process. Have they said something to indicate otherwise that I just missed?

[–] ExcessShiv@lemmy.dbzer0.com 8 points 2 weeks ago* (last edited 2 weeks ago) (3 children)

Sifting through information to determine what's true and what's not before presenting it to the public is a pretty crucial skill for an actual journalist, though. Verifying the correctness of their sources and of what they write is probably one of the most important parts of the job, regardless of whether they use AI tools.

[–] just_another_person@lemmy.world 5 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

Then maybe they shouldn't be using these tools in the first place. Other Conde Nast employees have already blown the whistle about this, which is funny, because Conde Nast sued all the AI companies for stealing content.

Whether there's a news article about it or not, these shitty tools are being shoved down everyone's throat, from developers to authors.

[–] MagnificentSteiner@lemmy.zip 23 points 2 weeks ago

Main character moment.

[–] Kissaki@feddit.org 10 points 2 weeks ago (9 children)

and “it was only once” is bullshit

They checked and then fired the author. I don't see how this amounts to "it was only once," implying nothing changed and it will happen again. Isn't firing the author already "holding journalism to a higher standard," which is what you're asking for?

[–] paequ2@lemmy.today 32 points 2 weeks ago (2 children)

Whoa. There are actually consequences? ArsTechnica is actually sorry??

[–] Blue_Morpho@lemmy.world 34 points 2 weeks ago (3 children)

No, the worker was fired, while the executive whose job is making sure that submitted work is correct was not.

The executives will get a bonus this year.

[–] echodot@feddit.uk 9 points 2 weeks ago

Copy editing won't be an executive's job. But yeah, they didn't do the bare minimum, which is concerning; it suggests they may not do the bare minimum on any of their articles. How much stuff went undiscovered?

I'm not going to outright say that journalists shouldn't use AI to write articles, because that's basically an unenforceable rule, but there should be someone at some point whose ultimate responsibility is to make sure that the articles are at least factual, whether they were written by a human or not. Determining whether a quote is legitimate is pretty easy: Google the quote, and if you can't find any other sources, you start asking questions. As I said, it's the bare minimum they could have done.

[–] WhyJiffie@sh.itjust.works 5 points 2 weeks ago

The executives will get a bonus this year.

well of course! they just saved a lot of money on wages, they deserve it!

[–] nutsack@lemmy.dbzer0.com 5 points 2 weeks ago

only if it goes viral

[–] kieron115@startrek.website 27 points 2 weeks ago

Journalistic integrity? On my internet? Well I never.

[–] Kissaki@feddit.org 16 points 1 week ago

"Futurism has confirmed." Later in the article: "reached out to three parties, no replies and no comment."

Huh? So how did they confirm?

[–] bstix@feddit.dk 10 points 2 weeks ago

"I ain't never said no such thing" - Albert Einstein

[–] jtrek@startrek.website 8 points 1 week ago

Seems fair. Was a pretty big fuck up. Might deter others from making similar fuck ups.

[–] Fedizen@lemmy.world 6 points 2 weeks ago

As they should

[–] nutsack@lemmy.dbzer0.com 5 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

I would fire them and hope that they are blacklisted from ever working in journalism again.

[–] rodneylives@lemmy.world 11 points 1 week ago (8 children)

I've interacted with Benj Edwards on social media for some time. He's done lots of good work! He's on (or maybe used to be on) Mastodon and Bluesky. He runs Vintage Computing and Gaming, and has written good articles for several prominent places. I've said as much in multiple forums, I feel like I've maybe been going on a crusade.

I haven't seen many others defending him. I'm really torn up over this. He had a weak moment. He was sick (I mean, literally). A few other people, notably Cory Doctorow and Paul Ford, have written LLM-defending pieces. And the AI hype has been deafening.

It's amazing, though, that so soon after he used AI, it immediately hallucinated something job-ending. I knew it was really bad, but I didn't know it was THAT bad. With so many people talking positively about it, you get the sense that hallucinations must happen, what, maybe 5% of the time?

To me, it seems like the kind of mistake that he should be able to apologize for, promise not to do it again, and move on. But we've all had our good will taken advantage of for so long by malicious actors, like how Gamergate was used as a wedge to push loathsome politics onto a legion of young males. It feels like we can't give anyone the benefit of the doubt any more.

I don't know. I know I'm influenced by all the good work he's done. I feel like that shouldn't all be thrown away.

[–] partofthevoice@lemmy.zip 9 points 1 week ago

5% of the time? LLMs, from their own perspective, are only capable of hallucinating. There's no difference between what they're doing in the cases we call "hallucinating" and the cases we call "correct." It's the same process.
