
It just came to my mind..... An open ended question..... Can computers really think ???

[–] lung@lemmy.world 14 points 1 month ago (1 children)

There's a whole school of philosophy that has argued about this for... well, forever, but especially over the last 100 years: the philosophy of mind. The problem is definition: what does it mean to think? Some argue it requires consciousness, but then you hit the same definition problem: what the hell is consciousness?

So on the trivial side, yes, of course computers can think, if thoughts are nothing special. Computers have states, and they can react to and inspect their own states. Is that thinking? LLMs use neural networks loosely modeled on the brain to generate streams of words, encoding knowledge and concepts statistically. Is that thinking?
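Purely as a toy illustration of "a machine with state that inspects its own state", a few lines of Python; nothing here is meant as a model of a mind:

```python
# A trivial program that has state, inspects it, and reacts to it.
# Nobody would call this "thinking", which is the point of the question.
class Machine:
    def __init__(self):
        self.count = 0

    def step(self):
        self.count += 1
        # The machine "looks at" its own state and changes behaviour accordingly.
        if self.count % 2 == 0:
            return f"my count is {self.count}, so this step I report 'even'"
        return f"my count is {self.count}, so this step I report 'odd'"

m = Machine()
for _ in range(4):
    print(m.step())
```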

On the other side: well, no, computers don't think because they don't have souls. Are souls real? Or maybe there's more to human thinking than just neural networks, like quantum effects? Or more complexity due to chemical biology? Is the ability to answer a question the same thing as understanding a concept (see Searle's Chinese room thought experiment)?

These are the questions that philosophers love to masturbate with, publish many papers on, and make no real progress on. Definitions are funny like that.

[–] TheracAriane@thebrainbin.org 1 points 1 month ago (2 children)

@lung@lemmy.world that's a nice image, philosophers masturbating πŸ˜„πŸ˜„πŸ˜„..........

But seriously, I'm amazed at how LLMs respond to my questions.

[–] bizarroland@lemmy.world 4 points 1 month ago

The trick behind it, and it is a trick, is that they have been fed billions of pages of text, covering a huge share of the sentences anyone has ever written down, and they use math to estimate the most appropriate word-by-word response to the question from all of the other examples of text they have to work with.
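If it helps make "use math to estimate the most appropriate word" concrete, here's a toy sketch in Python. It's a bigram model, nowhere near a real LLM (those use neural networks over huge contexts), but the word-by-word statistical sampling idea is the same; the training text and names are made up for illustration:

```python
import random
from collections import defaultdict

training_text = (
    "computers can think if thinking is just state changes "
    "computers can compute and computers can repeat what they have seen"
)

# Count which word tends to follow which word in the training text.
following = defaultdict(list)
words = training_text.split()
for prev, nxt in zip(words, words[1:]):
    following[prev].append(nxt)

def generate(start, length=8):
    """Generate text word by word by sampling from observed continuations."""
    out = [start]
    for _ in range(length):
        candidates = following.get(out[-1])
        if not candidates:
            break
        out.append(random.choice(candidates))
    return " ".join(out)

print(generate("computers"))
```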

Current LLMs are incapable of creating an original combination of words (in the absolute sense). They don't make anything. They just repeat. They are stochastic parrots.

Sometimes the answer is so obvious, assuming you have all of the relevant information, that you can provide the right answer without thinking at all. And when LLMs are correct, it is because of this phenomenon, not because they actually thought about the question and came up with a response.

[–] trollercoaster@sh.itjust.works 1 points 1 month ago* (last edited 1 month ago)

> @lung@lemmy.world that's a nice image, philosophers masturbating πŸ˜„πŸ˜„πŸ˜„β€¦

Diogenes would occasionally do this in public. For some unknown reason, his contemporaries didn't regard the sight as a nice image.