Thanks to @[email protected] for the links!

Here’s a link to Caltech’s press release: https://www.caltech.edu/about/news/thinking-slowly-the-paradoxical-slowness-of-human-behavior

Here’s a link to the actual paper (paywall): https://www.cell.com/neuron/abstract/S0896-6273(24)00808-0

Here’s a link to a preprint: https://arxiv.org/abs/2408.10234

[–] [email protected] 0 points 3 months ago (2 children)

You are confusing input with throughput. They agree that the input is much greater. It's the throughput that is so slow. Here's the abstract:

This article is about the neural conundrum behind the slowness of human behavior. The information throughput of a human being is about 10 bits/s. In comparison, our sensory systems gather data at ~10^9 bits/s. The stark contrast between these numbers remains unexplained and touches on fundamental aspects of brain function: what neural substrate sets this speed limit on the pace of our existence? Why does the brain need billions of neurons to process 10 bits/s? Why can we only think about one thing at a time? The brain seems to operate in two distinct modes: the “outer” brain handles fast high-dimensional sensory and motor signals, whereas the “inner” brain processes the reduced few bits needed to control behavior. Plausible explanations exist for the large neuron numbers in the outer brain, but not for the inner brain, and we propose new research directions to remedy this.
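For a rough sense of where a number of this order could come from, here is a back-of-the-envelope sketch only: the typing speed and the ~1 bit per character estimate for English text are illustrative assumptions, not figures from the paper; only the ~10^9 bits/s sensory figure is quoted from the abstract.

```python
# Back-of-the-envelope contrast between behavioral output and sensory input.
# Assumed figures: a fast typist (~120 words/min, ~5 characters/word) and
# Shannon's classic ~1 bit/character ballpark for English text.
chars_per_second = 10        # ~120 wpm * 5 chars/word / 60 s
bits_per_char = 1.0          # rough entropy of English text per character

behavioral_bits_per_second = chars_per_second * bits_per_char   # ~10 bits/s
sensory_bits_per_second = 1e9                                    # figure quoted in the abstract

print(f"behavioral throughput ~ {behavioral_bits_per_second:.0f} bits/s")
print(f"sensory input         ~ {sensory_bits_per_second:.0e} bits/s")
print(f"ratio                 ~ {sensory_bits_per_second / behavioral_bits_per_second:.0e}")
```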

[–] [email protected] 0 points 3 months ago* (last edited 3 months ago) (1 children)

He's not.

Executive function has limited capacity, but executive function isn't your brain (and there's no reasonable definition that limits it to anything as absurd as 10 bits). Your visual center is processing all those bits that enter the eyes. All the time. You don't retain all of it, but retaining any of it necessarily requires processing a huge chunk of it.

Literally just understanding the concept of a car when you see one involves much more than 10 bits of information.

[–] [email protected] 0 points 3 months ago (1 children)

I think we are all speaking without having read the paper (and in my case, I know I wouldn't understand it if I did), so dismissing it outright without knowing how they define things or measure them is not really the best course here.

I would suggest that Caltech studies don't tend to be poorly done.

[–] [email protected] -1 points 3 months ago* (last edited 3 months ago) (1 children)

There is literally nothing the paper could say and no evidence they could provide to make the assertion in the title anything less than laughable.

There are hundreds of systems in your brain that are actively processing many, many orders of magnitude more than ten bits of information per second all the time. We can literally watch them do so.

It's possible the headline is a lie by someone who doesn't understand the research. It's not remotely within the realm of plausibility that it resembles reality in any way.

[–] [email protected] 0 points 3 months ago (1 children)

There is literally nothing the paper could say and no evidence they could provide to make the assertion in the title anything less than laughable.

That is quite the claim from someone who has apparently not even read the abstract of the paper. I pasted it in the thread.

[–] [email protected] -1 points 3 months ago* (last edited 3 months ago) (1 children)

It doesn't matter what it says.

A word is more than 10 bits on its own.

[–] [email protected] 0 points 3 months ago (1 children)

You know, dismissing a paper without even taking a minute to read the abstract and basing everything on a headline to claim it's all nonsense is not a good look. I'm just saying.

[–] [email protected] -1 points 3 months ago* (last edited 3 months ago) (1 children)

The point is that it's literally impossible for the headline to be anything but a lie.

I don't need to dig further into a headline that claims cell towers cause cancer because of deadly cell signal radiation, and that's far less deluded than this headline is.

The core concept is entirely incompatible with even a basic understanding of information theory or how the brain works.

(But I did read the abstract, without realizing it was the abstract, because it reads like such nonsensical babble. That makes it even worse.)

[–] [email protected] 0 points 3 months ago (1 children)

Again, refusing to even read the abstract when it has been provided for you because you've already decided the science is wrong without evaluating anything but a short headline is not a good look.

In fact, it is the sort of thing that people who claim cell towers cause cancer are famous for doing themselves.

[–] [email protected] -1 points 3 months ago (1 children)

The headline is completely incompatible with multiple large bodies of scientific evidence. It's the equivalent of claiming gravity doesn't exist. Dismissing obvious nonsense is a necessary part of filtering the huge amount of information available.

But I did read the abstract and it makes the headline look reasonable by comparison.

[–] [email protected] 0 points 3 months ago (1 children)

I don't suppose it would be worth asking if your professional field was neurology...

[–] [email protected] -1 points 3 months ago (1 children)

Appealing to authority doesn't strengthen your argument.

A piece of paper is not a prerequisite to the extremely basic level of understanding it takes to laugh at this.

[–] [email protected] 0 points 3 months ago (1 children)

So essentially what you are saying is that you have no expertise in neurology and have not read the paper or evaluated any of the data or the methodology and yet, despite all of that, you know for certain that it is wrong.

Please explain your certainty. And if you appeal to "common sense," please note that common sense is why people thought the sun orbited the Earth for thousands of years.

[–] [email protected] -1 points 3 months ago (1 children)

No, I am saying that I do have a meaningful working knowledge of how the brain works and of information theory, beyond the surface level it would take to understand that the headline is bullshit.

You don't need to be a Nobel-prize-winning physicist to laugh at a paper claiming gravity is impossible. This headline is at that level. Literally just processing one word per second completely invalidates it, because an average vocabulary of 20k words means that every word, by itself, carries ~14 bits of information.
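As a quick check of that arithmetic (a minimal sketch; the 20k vocabulary is the commenter's assumed figure, and treating all words as equally likely gives an upper bound on the per-word information, since real word frequencies are heavily skewed):

```python
import math

vocabulary_size = 20_000  # the commenter's assumed average vocabulary

# If every word were equally likely, picking one word out of 20,000
# carries log2(20000) bits of information.
bits_per_word = math.log2(vocabulary_size)
print(f"{bits_per_word:.1f} bits per word")  # ~14.3
```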

[–] [email protected] 1 points 3 months ago (1 children)

You are already not using 'bit' the way it is defined in the paper. Again, not a good look.

[–] [email protected] -1 points 3 months ago (2 children)

The paper is not entitled to redefine a scientific term to be completely incorrect.

A bit is a bit.

[–] [email protected] 0 points 3 months ago (1 children)

And now it's "it's the paper's fault it's wrong because it defined a term the way I didn't want it defined."

[–] [email protected] 0 points 3 months ago (1 children)

Yes.

Science is built on a shared, standardized base of knowledge. Laying claim to a standard term to mean something entirely incompatible with the actual definition makes your paper objectively incorrect and without merit.

[–] [email protected] 1 points 3 months ago

Cool. Let me know when you feel like reading the paper since Aatube already showed you they are using it properly. Or at least admitting you might not know as much about this as you think you do...

[–] [email protected] 0 points 3 months ago

From a cursory glance, it seems at least quite close to the definition of a bit in relation to entropy, also known as a shannon.

Nevertheless, the term bits of information or simply bits is more often heard, even in the fields of information and communication theory, rather than shannons; just saying bits can therefore be ambiguous. Using the unit shannon is an explicit reference to a quantity of information content, information entropy or channel capacity, and is not restricted to binary data, whereas bits can as well refer to the number of binary symbols involved, as is the term used in fields such as data processing. —Wikipedia article for shannons
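To make that distinction concrete, here is a toy example (not taken from the paper or the quoted article): the information content of a symbol stream in bits/shannons depends on the symbol probabilities and can be much lower than the number of binary digits used to store it.

```python
import math

def entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2 p), in bits (shannons) per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A heavily skewed 4-symbol source: storing one symbol naively takes
# 2 binary digits, but its information content is far lower.
probs = [0.9, 0.05, 0.03, 0.02]
print(f"raw encoding       : {math.log2(len(probs)):.2f} binary digits per symbol")
print(f"information content: {entropy_bits(probs):.2f} bits (shannons) per symbol")
```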

[–] [email protected] 0 points 3 months ago* (last edited 3 months ago) (1 children)

You are confusing input with throughput.

No, I'm not; I read that part. Input is, for instance, hearing a sound wave, which the brain can process at amazing speed, separating a multitude of simultaneous sounds and translating them into meaningful information, be it music, speech, or a noise that shouldn't be there. It's true that this part is easier to measure, as we can do something similar on computers, although not nearly as well: we can determine not only the content of sounds but also extrapolate from them in real time. The raw sound may only be about 2x22k bits, but the processing required is far higher, and that is even more obviously way, way above 10 bits per second.

This is a very complex function that requires loads of processing, and it can distinguish, with microsecond precision, when a sound reaches each ear in order to determine its direction.
The same is the case with vision, which, although not at all the resolution we think it is, also requires massive processing to be interpreted into something meaningful.
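For scale, here is one reading of the "2x22k" figure above as a raw data rate. This is an assumption-laden sketch: two channels of ~22 kHz bandwidth sampled at 44.1 kHz with 16-bit samples, which are CD-audio conventions rather than numbers from the paper or the comment.

```python
# Raw data rate of a stereo audio stream under CD-style assumptions.
# These parameters are illustrative, not taken from the paper.
channels = 2
sample_rate_hz = 44_100    # ~2x the 22 kHz bandwidth (Nyquist)
bits_per_sample = 16

raw_bits_per_second = channels * sample_rate_hz * bits_per_sample
print(f"{raw_bits_per_second:,} bits/s (~{raw_bits_per_second / 1e6:.1f} Mbit/s)")
```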

Now the weird thing is: why in the world do they think consciousness, which is even MORE complex, should operate at a lower speed? That idea is outright moronic!!!

Edit:

Changed nanosecond to microsecond.

[–] [email protected] 1 points 3 months ago (1 children)

As I suggested to someone else, since none of us has actually read the paper (and I know I would not have the requisite knowledge to understand it if I did), dismissing it with words like "moronic" is not warranted. And as I also suggested, I don't think such a word can generally be applied to Caltech studies. They have a pretty solid reputation as far as I know.

[–] [email protected] 0 points 3 months ago* (last edited 3 months ago) (1 children)

I'm not fucking reading a paper with such ridiculous claims. I gave it a chance, but it simply isn't worth it. And I understand their claims and argumentation perfectly; they simply don't have a clue about the things they make claims about.
I've been investigating and researching these issues for 40 years, working from scientific evidence, so please piss off with your claims that I don't understand it.

[–] [email protected] 1 points 3 months ago

Without evaluating the data or methodology, I would say that the chance you gave it was not a fair one. Especially since you decided to label it "moronic." That's quite a claim.