this post was submitted on 03 Feb 2026
125 points (94.3% liked)

Technology

[–] RedWeasel@lemmy.world 49 points 1 day ago* (last edited 1 day ago) (13 children)

So, around 1947. It took about 14 years from there to get transistors into chips. So another decade and a half?

Edit: and another 15 to 25 years after that before it reaches consumer households?

[–] kutt@lemmy.world 8 points 15 hours ago (7 children)

I don’t think it will ever reach consumer households, since it requires extremely complex and expensive materials, tools, and physical conditions. A major breakthrough could change that, but it seems highly unlikely.

Also, we don’t really have a use for them, at least for regular users. They won’t replace classical computers.

But you can already access some QCs online. IBM has a paid remote API for instance.
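For a sense of what those remote services actually run: a circuit of gates applied to qubit amplitudes. Here is a toy single-qubit state-vector simulation in plain Python — just an illustrative sketch, not the IBM API (which you would access through an SDK like Qiskit):

```python
import math

# Start in |0>: amplitudes for the basis states |0> and |1>.
state = [1.0, 0.0]

# Apply a Hadamard gate: H = 1/sqrt(2) * [[1, 1], [1, -1]].
# This puts the qubit into an equal superposition of |0> and |1>.
h = 1 / math.sqrt(2)
state = [h * state[0] + h * state[1],
         h * state[0] - h * state[1]]

# Measurement probabilities are the squared amplitudes.
probs = [a * a for a in state]
print(probs)  # both outcomes come out ~0.5: an equal superposition
```

A real cloud backend does the same kind of thing, except the circuit is sent over the network and executed (or sampled) on actual hardware, with the measurement counts returned to you.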

[–] RedWeasel@lemmy.world 1 points 14 hours ago (2 children)

I can currently only see them used as accelerators of some sort. They could potentially serve a GPU-like role, but I suspect some form of general compute will come first. GenAI anyone? SkyNET? But that only matters if they can be made portable enough for laptops or phones, which is still a major issue needing to be addressed.

I don't expect them to replace traditional chips in my lifetime if ever.

[–] kutt@lemmy.world 2 points 13 hours ago

Yes, they will probably never replace them, because they’re actually slower than classical computers at simple calculations.

Quantum ML is actively being researched. However, I’m not up to date on the advances in that field specifically.

But the good news is that they don’t need to be portable; we can keep using them remotely, just as we do now!
