177 comments

  • nis0s a few seconds ago

    I think the discussion on serial vs parallel processing is incomplete in the linked paper, which is one reason I think the 10 bits/s bottleneck is an incomplete or misinterpreted result. Here's a review with sources on serial processing, https://journalofcognition.org/articles/10.5334/joc.185

    > Cognitive psychology has mainly focused on structural and functional limitations of cognitive processes when facing multitasking requirements. Structural limitations assume strict serial processing for at least one processing stage, while functional limitations assume flexible, parallel processing only limited by the number of available resources. Human movement science, on the other hand, emphasizes the plasticity of cognition and training possibilities. As both approaches have provided ample empirical evidence for their views but have predominantly worked in isolation, this example clearly illustrates the need for a more integrative approach to multitasking. A challenge for the contemporary research on multitasking is to bring together the issues of structure, flexibility, and plasticity in human multitasking, offering a new integrative theoretical framework that accounts for this fundamental aspect of human behaviour.

    From one of the papers cited by the above reference (Hommel 2020),

    > A closer look reveals that the questions being asked in dual-task research are not particularly interesting or realistic, and the answers being given lack mechanistic detail. In fact, present theorizing can be considered mere empirical generalization, which has led to merely labeling processing bottlenecks rather than describing how they operate and how they actually produce the bottleneck.

    So, while I applaud the authors on generating buzz and discussion, I think their promising work will benefit from more serious consideration of the underlying neurophysiology.

  • tsimionescu 3 hours ago

    It seems very odd that the article seems to be measuring the information content of specific tasks that the brain is doing or specific objects that it is perceiving. But the brain is a general-purpose computer, not a speed-card computer, or English text computer, or binary digit computer, or Rubik's cube computer.

    When you look at a Rubik's cube, you don't just pick out specific positions of colored squares relative to each other. You also pick up the fact that it's a Rubik's cube and not a bird or a series of binary digits or English text. If an orange cat lunged at your Rubik's cube while you were studying it, you wouldn't process it as "face 3 has 4 red squares on the first row, then an orange diagonal with sharp claws", you'd process it as "fast moving sharp clawed orange cat attacking cube". Which implies that every time you look at the cube you also notice that it's still a cube and not any of the millions of other objects you can recognize, adding many more bits of information.

    Similarly, when you're typing English text, you're not just encoding information from your brain into English text, you're also deciding that this is the most relevant activity to keep doing at the moment, instead of doing math or going out for a walk. Not to mention the precise mechanical control of your muscles to achieve the requisite movements, which we're having significant trouble programming into a robot.

    • gtirloni 2 hours ago

      My thoughts exactly. It makes no sense to me that what I'm thinking and perceiving in real-time is the equivalent of 10 bit/s of data.

      • sigmoid10 2 hours ago

        Has anyone here even read more than the title? The article literally explains that you perceive at a rate of 10^9 bits per second. But after filtering and preprocessing in the outer brain you are only left with about 10 bits per second for conscious processing for things like motor functions. Yes, you can see a Rubik's cube and perceive all sorts of facts about it and the environment at the same time. But try solving it with your hands while someone shows you a bunch of objects and asks you visual comprehension questions at the same time. You might still perceive those other objects, but consciously classifying them verbally is gonna be really difficult. It's no surprise that the feature space that your deeper brain can actively work on is quite limited.

        • burnte a few seconds ago

          I did read the summary and see what you saw, but that still leads me to believe the headline of the article is clickbait and that the authors don't understand that action signalling doesn't require the same bandwidth as information processing, but even then it's way more than 10b/s.

          Look at a backhoe. It has a few levers and a couple pedals. EXTREMELY simple interface. Each lever and pedal is basically 1 bit for engage/disengage, but the operator has to process orders of magnitude more sensory info to operate it properly. You could use an arduino to control a backhoe, but you'd need quite a powerful computer to know what to tell the arduino to do. This shouldn't surprise anyone. Knowing how to use the tool well is always far more complicated than simply knowing how the tool operates.

        • hathawsh 18 minutes ago

          Another way to put it: try to perform a skill you have never practiced. For example, if you've never played the piano or tried to read sheet music, see how long it takes you to play a few notes correctly. It's complex enough that you'll very likely find yourself limited to around 10 bits per second. You shouldn't count the bits handled by visual processing, basic motor control, and other things you have practiced all your life. If you practice the skill, the skill moves out of your conscious processing and no longer counts toward the 10 bits per second.

        • xattt 43 minutes ago

          Listen to a podcast at double speed. Assuming a normal talking speed of 150 words per minute, 300 words per minute of written word is not 10 bits per second.

          • 01HNNWZ0MV43FF a few seconds ago

            (Shannon estimates 11.82 bits per word, so 300 WPM is 59.1 bits per second)
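
            As a quick back-of-envelope check (a Python sketch using those same numbers as assumptions):

              bits_per_word = 11.82            # Shannon's estimate for English
              words_per_minute = 300           # a 150 wpm podcast played at double speed
              print(bits_per_word * words_per_minute / 60)   # ~59.1 bits/s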

          • sigmoid10 21 minutes ago

            Consider normal text compression and you're left with a few bits at best for most of those "fast talkers/listeners." And the human brain is very good at compression.

        • wbl an hour ago

          What conscious motor processing? My motor functions largely take care of themselves while I daydream when walking or consider where I want to steer while driving.

          • sigmoid10 23 minutes ago

            That's just motor reflexes that don't even enter higher cognitive processing. But daydreaming probably does process at the same rate as normal language, as was explained in the other comment. Try doing algebra in your head while running an obstacle course you've never seen and you'll be much slower at everything.

        • kens an hour ago

          > Has anyone here even read more than the title?

          Since it costs $35.95 to read the article, probably not. Seriously, paywalling of scientific research is obviously wrong.

        • gtirloni 44 minutes ago

          You're missing my point. I'm saying that `try solving it with your hands while someone shows you a bunch of objects and asks you visual comprehension questions at the same time` is more than 10 bit/s of data being processed. I'm saying made up "tasks and outcomes" in this study are not a measure of the brain's throughput IN THE INNER LAYERS.

        • mystified5016 2 hours ago

          10 bits per second is effectively nothing. Not even a single cell could operate at 10 bits per second. Every organic system would collapse immediately.

          • sigmoid10 an hour ago

            Remember, this is an intermediate encoding of a hidden feature space between perception and planning. What you see at the start and the end of the neural network might be very different. Consider this: Typing at 60 words/minute, 5 characters/word and 8 bits/character gives a gross bit rate of 40 bits/second. With today's compression algorithms, you can easily get a 4:1 reduction in data. That leaves you at approximately 10 bits/second that are consciously processed in your brain. Probably even less since your brain might be much better at encoding language than even our best models. Even if some of those numbers are off by a certain factor, the number in the paper is certainly in the right ballpark when you consider orders of magnitude.
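
            Spelled out, that arithmetic looks like this (every number here is the comment's assumption, not a measurement):

              wpm, chars_per_word, bits_per_char = 60, 5, 8
              gross = wpm * chars_per_word * bits_per_char / 60   # 40 bits/s of raw keystrokes
              compression_ratio = 4                               # ~4:1 for ordinary English text
              print(gross / compression_ratio)                    # ~10 bits/s of actual information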

      • kijin 14 minutes ago

        Send a query to your favorite database. The database preprocesses hundreds of gigabytes of data, map-reduces it, and finally returns a single 32-bit integer, taking exactly 3.2 seconds to do so.

        Nobody would say that the database can only process 10 bits per second. The query just happened to ask for a very simplified answer.

        • scotty79 6 minutes ago

          Everybody would say it outputs 10 bits per second. And when it comes to consciousness, simplified answers at 10 bits per second are the best you can get. This article asks why.

    • wruza an hour ago

      Otoh, I just spent a minute to even comprehend your idea. We are living in the information era, the era of dodging an orange cat has ended. And we suck at this new thing, really. We’re like slow af emulators of logic programs and tend to fallback to animal mode when tired or clueless.

    • infogulch 2 hours ago

      Such reductionist analyses are reliably blind to the complexity of being embodied in physical reality.

      • scotty79 4 minutes ago

        Reductionism is the only way humanity ever progressed on anything.

      • WXLCKNO an hour ago

        Should they analyse everything all at once? Your comment seems reductionist to the realities of writing a paper on a specific subject.

    • colordrops an hour ago

      The PhD student writing this could be excused for being young and inexperienced, but their advisor, tagged second on the paper, should have cut this off at the pass.

  • bennettnate5 3 hours ago

    > Why can we only think about one thing at a time?

    Maybe this is just a perception thing. Sure, you can only really keep up one stream of thought, visualization or inner dialogue (whatever you want to call it) at a time, but perhaps that's because we learn all our lives that direct communication is a one-channel, linear thing--speaking and listening focused on one topic at a time. Our brain does plenty of thinking in the background that leads to "a-ha!" moments even when the direct focus of our thoughts isn't on that topic. What if the mind could maintain multiple threads of thoughts at once, but our language coerces our thought patterns into being linear and non-concurrent?

    • Enginerrrd 3 hours ago

      As someone without an inner monologue, and someone that's spent a LOT of time meditating, it's not the language. It's the attention mechanisms themselves.

      Buddhist scholars insist that while we can have multiple threads of attention in our awareness, like strings with pearls of experience/thoughts, we can only actually hold one little pearl of information from that stream in our attention at a time, and that we flit between them quite rapidly.

      Personally, I sort of agree, but I notice that there seems to be a time-compression thing happening where the pearl delivered to attention can contain a compressed summary of continuous perception. This seems to work for 2 things at once in awareness. When you start monitoring 3+ streams, there are gaps. And even maintaining the 2 streams continuously is exhausting so the mind tends to relax a little and leave gaps on a normal basis, but it seems like it can monitor dual feeds when it's particularly important.

      My understanding is that neuroscience largely seems to agree with the above.

      (Actually, I'll note that the USUAL mode of being doesn't even monitor one stream continuously. A lot of the weird effects (and deeply interesting ones!) they talk about in meditative arts seem to pop up when you progress to being able to hold truly continuous attention.)

      • heyjamesknight 3 hours ago

        What you're describing here is software, not hardware—Cognitive Science is the relevant field, not Neuroscience.

        That said, your understanding is largely supported by our current understanding of consciousness, attention, and perception. The attention mechanism doesn't handle parallel processing well—but can operate "multi-threaded", where it juggles several foci at once (with some obvious cost to switching between them). But I think it's a mistake to assume that decision making has to be done within this attention context. While we may only be aware of a single thread at any given time, the brain is doing a lot of parallel processing. We can only focus our attention on a single cognitive task, but that doesn't mean we're not actively performing many others.

        • mhluongo 2 hours ago

          What you're describing here is dualism and Descartes, in response to a post that references Buddhist scholarship, a philosophy famously focused on monism.

          "Cognitive science" vs "neuroscience" as a concept is just how we decided to slice the problem up for academia.

          Next time, maybe cut the first paragraph ;)

        • wruza an hour ago

          we may only be aware of a single thread at any given time

          We may be not a single mind, but a bunch of minds. It just happens that the mind that “you” are reads this and has written the above comment, cause it’s of that kind (just like “all biological beings in this thread happen to be humans” type of a filter). Other minds can live completely different lives, just inside the same skull. And share emotions and thoughts with you sometimes from their prison.

          This “aware” part is pretty mysterious, because the physical mind could operate without it perfectly. But for some reason, the space containing a mind experiences this awareness thing.

      • davedx 2 hours ago

        Sometimes I'll be deeply thinking about something while driving, and discover I'm at the last road to my house without remembering having driven the previous few blocks. It's quite disturbing. When I say deeply thinking I don't mean anything involving phones or external stimuli - really just thinking about a particular problem I'm working on. I also don't deliberately engage this deep mode of thought, I just sort of slide into it naturally.

        Does anyone else have this happen? I don't think my driving is suffering, but it's hard to really honestly say?

        • digging 2 hours ago

          Yes, it's a classic example of the power and skill of your "unconscious" mind - your consciousness is freed up to do novel work because the drive home is so routine that your unconscious mind can do almost all of the work. Should something change - a traffic jam, a detour, a pedestrian crossing the road - your conscious attention will be called back to the more urgent task which is making a decision about how to handle the driving situation.

          • spigottoday an hour ago

            It seems interesting to me that what we refer to as the conscious mind is unconscious a third of each day and the part we call unconscious is active 24 by 7.

            • digging 11 minutes ago

              I'm out of my depth here, but a high-level response:

              First, I don't think the "unconscious" part is a single process, but myriad processes, and I'd bet they wax and wane.

              Second, the "conscious" part is the part that can reason about itself and think abstractly. I think it would be correct to say it's doing higher level computations. The important part is that this is more costly - it's not optimized because it has to be flexible, so it would make sense that it's resting as often as possible.

              • kijin a minute ago

                So, one high-performance, high-power, general-purpose processor to handle the foreground task, and a bunch of low-power processors for background tasks.

                Looks like ARM got it right with its big.LITTLE architecture. :)

        • lanstin 2 hours ago

          When I have a deeply engrossing unitary (I.e. not one of five tasks but one task for months) project at work I had better start commuting by train and cut out the driving. I have lost two cars by not doing that. Fortunately no one was hurt. One car I had towed to the work parking lot, and just never thought about it until some time after the project when it turned out the office just had it towed off as unknown junk. The project went well.

        • topherclay 2 hours ago

          The way most people refer to this is "driving on autopilot."

    • jdbxhdd 3 hours ago

      Also I do not agree with the premise that we can only think about one thing at a time.

      We routinely communicate with multiple people at once and also communicate with the same persons in multiple threads of conversations.

      Of course this means that we switch between those tasks and do not really do them in parallel. At most we listen to one person, answer a second via speech, a third via text while thinking about what to respond to a fourth.

      We just switch our focus of attention quite fast

      • imzadi 2 hours ago

        This is the part that bothers me. I can definitely think of multiple things at a time. It really just depends on the complexity of the tasks. I can listen to and process an audiobook while driving to work every morning, for instance. I definitely can have multiple thoughts in parallel. I remember when I used to recite prayers, I would be reciting the memorized prayer while thinking about other things. Both things were happening at the same time. The memorized task takes less processing power, but it still requires some thought to execute.

    • Bjartr 3 hours ago

      I wonder if some people with dissociative identity disorder, or who at least identify as plural, experience overlapping simultaneous trains of thought

      • pixl97 3 hours ago

        Heh, if there are two yous occurring at the same time, one you would never know about the other. Only third-party observation would be able to tell you.

        • Bjartr 2 hours ago

          That assumes clean swaps between personalities, I'd wager that it gets messier than that for some.

    • thmsths 3 hours ago

      I am not qualified to judge whether you're right or wrong but I love that concept!

    • NoMoreNicksLeft 2 hours ago

      We think about many things at a time. But for those with malfunctioning brains that have the internal monologue going constantly, they mistake that monologue for their thoughts, and so it must be "one thing at a time". The language they experience their monologue in is, by its very nature, sequential: you can't speak or even hear/understand two parallel streams of speech.

      >Our brain does plenty of thinking in the background that leads to "a-ha!" moments even

      That's not "in the background". That's the real you, your real mind. That's the foreground. But, if your brain malfunctions as many do, then the monologue shows up and crowds out everything. Sometimes it is apparently loud enough that it even prevents those "a-ha!" moments.

      >but our language coerces our thought patterns into being linear and non-concurrent?

      The language should just be discarded. What you want is an internal silence.

      • lanstin 2 hours ago

        I wouldn't say it's language so much as unnecessarily added language. Words and sentences can appear and be useful, but there is a lot of mental activity that is not essential, just added-on responses to things. I wouldn't say a component that generates comments is a broken brain, but believing the comments or the beliefs embedded inside them can break your contentedness.

  • mjburgess 4 hours ago

    > If the questions are properly designed, each will reveal 1 bit of information about the mystery thing. If the guesser wins routinely, this suggests that the thinker can access about 2^20 ≈ 1 million possible items in the few seconds allotted. So the speed of thinking – with no constraints imposed – corresponds to 20 bits of information over a few seconds: a rate of 10 bits per second or less.

    This is an extrinsic definition of "information" which is task relative, and has little to do with any intrinsic processing rate (if such a thing can even be defined for the imagination).

    The question of why does biological hardware capable of very high "intrinsic rates" deliver problem solving at "very low extrinsic rates" seems quite trivial. Its even a non-sequitur to compare them: properties of the parts are not properties of wholes. "Why does a gas move at 1 m/s, when its molecules move at 1000s m/s..."

    All the 'intrinsic processing' of intelligence is concerned with deploying a very large array of cognitive skills (imagination, coordination, planning, etc.) that are fully general. Any given task requires all of those to be in operation, and so we expect a much slower rate of 'extrinsic information processing'.

    Consider how foolish the paper is to compare the intrinsic processing of a wifi network with the extrinsic task-specific processing of a human: it is likewise the case that if we set a computer the challenge of coordinating the solution of a task (e.g., involving several LLMs) across a network, its task-specific performance would drop off a cliff -- having a much slower 'solution rate' than 10 bits/second.

    These 'task-specific bits' represent a vast amount of processing work to solve a problem. And they have at least as much to do with the problem as with the system solving it.

    It seems to me all this paper does is define tasks in a highly abstract way that imposes a uniform cost to process '1 bit of task information'. Do the same for computers, and you'd likewise find tiny bitrates. The rate at which a problem is solved is 'one part of that problem per second' for a suitable definition of 'part'.

    • psb217 3 hours ago

      Another relevant point is the common anecdote about, e.g., some master engineer who gets paid big bucks to come fix some problem that's been blocking up a factory for weeks. The engineer walks around, listens to a few of the machines, and then walks up to one of the machines and knocks it with his elbow Fonzie style and the factory starts running again. The factory boss is happy his factory is working, but annoyed that he paid so much for such an "easy" solution.

      Ie, the amount of input and processing required to produce the "right" 10 bits might be far larger than 10 bits. Another obvious example is chess. The amount of bits conveyed by each move is small but, if you want to make the right move, you should probably put some deeper thought into it.

      Humans are essentially organisms that collect and filter information, boil it down to a condensed soup of understanding, and emit a light sprinkle of carefully chosen bits intended to reshape the future towards their desires.

      • pixl97 2 hours ago

        Humans are nature's best designed filters.

        Or another way of saying it is, the answer was right there all along, the hard part was filtering all the non-answer out.

    • Ghostt8117 3 hours ago

      This type of comment is my least favorite on HN. "Seems quite trivial," "non-sequitur to compare them," "foolish." I am not able to read the paper as I do not have access, but the published perspective has 131 citations which seem to consider everything from task-specific human abilities, to cortical processing speeds, to perception and limb movements and eye movements, and so on.

      I'm glad you thought about it too, but to assume that the authors are just silly and don't understand the problem space is really not a good contribution to conversation.

      • cscheid 2 hours ago

        (Disclosure: I’m a former academic with more than a handful of papers to my name)

        The parent comment is harshly criticizing (fairly, in my view) a paper, and not the authors. Smart people can write foolish things (ask me how I know). It’s good, actually, to call out foolishness, especially in a concrete way as the parent comment does. We do ourselves no favors by being unkind to each other. But we also do ourselves no favors by being unnecessarily kind to bad work. It’s important to keep perspective.

    • pizlonator 3 hours ago

      I was about to say this but you beat me to it.

      Seems like this 10 number comes out of the kind of research where the objective isn’t to find the truth, but to come up with an answer that is headline grabbing. It’s the scientific equivalent of clickbait.

      Too bad people fall for it.

    • fwip 3 hours ago

      Exactly. English text is thought to have about 10 bits per word of information content, yet you can read much more quickly than 1 word per second. That includes not just ingesting the word, but also comprehending the meaning the author is conveying and your own reflections on those words.
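
      A rough reading-rate version of that arithmetic (the 250 wpm figure is just an assumed typical silent-reading speed):

        bits_per_word = 10                 # rough information content of an English word
        words_per_minute = 250             # assumed ordinary reading speed
        print(bits_per_word * words_per_minute / 60)   # ~42 bits/s, well above 10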

  • zelon88 22 minutes ago

    Humans can transfer up to 39 bits/s during normal speech, so I highly doubt that it's accurate to describe human "throughput" as being only 10 bits/s.

    https://www.science.org/content/article/human-speech-may-hav...

  • crazygringo 5 hours ago

    Where do they get 10 bits/second?

    Heck, I can type way faster than 10 bits per second, even after gzipping the output.

    And when I consider the amount of sensory information that I consciously process (not that comes in, but that I conceptually analyze), it's got to be way higher.

    10 bits/s doesn't pass the smell test.

    • esperent 5 hours ago

      From the paper:

      > Quick, think of a thing... Now I’ll guess that thing by asking you yes/no questions.” The game ‘Twenty Questions’ has been popular for centuries as a thinking challenge. If the questions are properly designed, each will reveal 1 bit of information about the mystery thing. If the guesser wins routinely, this suggests that the thinker can access about 2^20 ≈ 1 million possible items in the few seconds allotted. So the speed of thinking – with no constraints imposed – corresponds to 20 bits of information over a few seconds: a rate of 10 bits per second or less.
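
      Their arithmetic, spelled out (the 2 seconds is my stand-in for the paper's "a few seconds"):

        questions = 20                      # well-designed yes/no questions, ~1 bit each
        items = 2 ** questions              # ~1 million distinguishable "things"
        seconds = 2
        print(items, questions / seconds)   # 1048576 items, 10.0 bits/s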

      • chongli 4 hours ago

        As the answerer, if you have a wide vocabulary or if you're a technical person then it's not too difficult to routinely choose words the other person simply does not know so that no amount of yes/no questions will get them there.

        Obscure medical terms (phlebotomy), names of uncommonly-known stars (Fomalhaut), obscure data structures (cache-oblivious lookahead arrays), mathematical constants (Feigenbaum's constants)... The list goes on and on!

        The point I'm trying to make is that most people who play Twenty Questions aren't trying to maximize the number of bits per second in their answer. They're actually trying to play semi-cooperatively. The fun part of Twenty Questions is when the other person guesses your word with as few questions remaining as possible. Having them get all the way to 20 and then you tell them "no you were way off to guess toothache, it was actually temporomandibular joint dysfunction" makes you look rather unsporting!

        Thus, since I think we can expect people who play Twenty Questions to actually try to choose a word they know the other person can guess within the space allowed, we can reasonably conclude that using the game as a way to establish some sort of rough constraint on the speed of thinking (in bits per second) is way off. In fact, I know from my own experience playing the game that I will think of and discard many words in a short time as I try to find one that will be in the sweet spot of challenge for the other person to guess.

      • largbae 4 hours ago

        So, in the context of random word lookup with filter for things, we have a latency of a few seconds and a total selection of 20 bits.

        Meanwhile the machinery involved in understanding that it is a game, processing the audio input of the question, and producing the output of the answer is all taken for granted.

      • IshKebab 4 hours ago

        It's nice when authors let you know you can safely ignore them so succinctly!

      • pro14 2 hours ago

        > Quick, think of a thing... Now I’ll guess that thing by asking you yes/no questions.”

        Every time I play this game, I can only think of one thing: https://t3.ftcdn.net/jpg/02/07/37/42/500_F_207374213_kNgoMel...

        So I guess that means I can only think at 1 bit per second.

        • lanstin 2 hours ago

          If there is only one answer it is zero bits.

      • andersource 4 hours ago

        If the questions were pre-determined, which they're usually not. Reminds me of Huffman coding and the reason that compression challenges measure submissions looking at artifacts required to run them in addition to compressed size. I tend to agree with OP that this doesn't pass the smell test

      • crazygringo 2 hours ago

        What a truly bizarre method. There are so many things wrong with it I don't even know where to begin.

        No wonder they came up with such an obviously nonsensical answer in the end.

    • wat10000 5 hours ago

      English is about one bit per letter. If you type at a very fast 120 WPM then you’re right at 10 bps. Computers just don’t represent English very efficiently, even with gzip.

      • codedokode an hour ago

        What if you are typing not an English text, but a series of random letters? This gets you to 5-6 bits per letter.

        • wat10000 an hour ago

          I think this gets into what you consider to be “information.” Random noise is high entropy and thus high information in one sense, and zero information in another.

      • esperent 5 hours ago

        > English is about one bit per letter

        Where did you get that number from? How would you represent a letter using 1 bit?

        • wat10000 4 hours ago

          It’s an experimental result by Shannon: https://archive.org/details/bstj30-1-50/page/n5/mode/1up

          In short, you show someone an English text cut off at an arbitrary point and ask them to predict which letter comes next. Based on how successful they are, you can calculate the information content of the text. The result from this experiment was approximately one bit per letter.

          Representing it is not the concern of the experiment. I don’t think anyone has a scheme that can do this. But it’s straightforward enough in theory. You create a compressor which contains a simulated human English speaker. At each point, ask the simulation to rank all the letters that might come next, in order. Emit the rank of the actual next letter into your compressed data. To decompress, run the same procedure, but apply the ranks you read from the data stream to the simulation’s predictions. If your simulation is deterministic, this will produce output matching the compressor’s input.
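
          A toy sketch of that scheme (using an order-0 adaptive predictor instead of a simulated English speaker, so the ranks are nowhere near 1 bit each, but the round trip works):

            from collections import Counter

            ALPHABET = "abcdefghijklmnopqrstuvwxyz "

            def ranking(counts):
                # Deterministic prediction: most frequent letter so far first, ties alphabetical.
                return sorted(ALPHABET, key=lambda c: (-counts[c], c))

            def compress(text):
                counts, ranks = Counter(), []
                for ch in text:
                    ranks.append(ranking(counts).index(ch))  # emit the rank of the actual next letter
                    counts[ch] += 1
                return ranks

            def decompress(ranks):
                counts, out = Counter(), []
                for r in ranks:
                    ch = ranking(counts)[r]                  # apply the rank to the same predictions
                    out.append(ch)
                    counts[ch] += 1
                return "".join(out)

            text = "the cat sat on the mat"
            assert decompress(compress(text)) == text
            print(compress(text))   # a good predictor yields mostly small ranks, cheap to encode

          The better the predictor, the more the rank stream skews toward zero, and the closer an entropy coder on top of it gets to Shannon's roughly 1 bit per letter.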

          • dTal 2 hours ago

            >I don’t think anyone has a scheme that can do this

            If you substitute "token", for "letter", what you have described is exactly what a modern LLM does, out of the box. llama.cpp even has a setting, "show logits", which emits the probability of each token (sadly, only of the text it outputs, not the text it ingests - oh well).

            I don't think anyone actually uses this as a text compressor for reasons of practicality. But it's no longer a theoretical thought experiment - it's possible today, on a laptop. Certainly you can experimentally verify Shannon's result, if you believe that LLMs are a sufficiently high fidelity model of English (you should - it takes multiple sentences before it's possible to sniff that text is LLM generated, a piece of information worth a single bit).

            Oh look, Fabrice Bellard (who else?) already did it: https://bellard.org/ts_zip/ and you may note that indeed, it achieves a compression ratio of just north of 1 bit per byte, using a very small language model.

          • malfist 4 hours ago

            Say that experiment is correct. Wouldn't that imply that the information content of a single letter varies based on the possible future permutations?

            I.e., The string "I'v_" provides way more context than "con_" because you're much more likely to get I'm typing "I've" instead of "contraception"

            That seems to disprove the idea that a letter is a bit.

            Also, the fact that there are more than two letters indicates more than one bit, though I wouldn't want to even start to guess the encoding scheme of the brain.

            • wat10000 4 hours ago

              I don’t follow. Of course the probabilities change depending on context. 1 bit per letter is an average, not an exact measure for each individual letter. There are cases where the next letter is virtually guaranteed, and the information content of that letter is much less than one bit. There are cases where it could easily be many different possibilities and that’s more than one bit. On average it’s about one bit.

              > Also the fact that there are more than two letters also indicate more than one bit

              This seems to deny the possibility of data compression, which I hope you’d reconsider, given that this message has probably been compressed and decompressed several times before it gets to you.

              Anyway, it should be easy to see that the number of bits per symbol isn’t tied to the number of symbols when there’s knowledge about the structure of the data. Start with the case where there are 256 symbols. That implies eight bits per symbol. Now take this comment, encode it as ASCII, and run it through gzip. The result is less than 8 bits per symbol.

              For a contrived example, consider a case where a language has three symbols, A, B, and C. In this language, A appears with a frequency of 999,999,998 per billion. B and C each appear with a frequency of one in a billion. Now, take some text from this language and apply a basic run-length encoding to it. You’ll end up with something like 32 bits per billion letters on average (around 30 bits to encode a typical run length of approximately 1 billion, and 2 bits to encode which letter is in the run), which is way less than one bit per letter.
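
              A rough check of that contrived example, using its own assumed numbers:

                import math
                n = 1_000_000_000                            # length of a typical all-'A' run
                run_length_bits = math.ceil(math.log2(n))    # ~30 bits to write the run length
                symbol_bits = 2                              # which of A/B/C the run contains
                print((run_length_bits + symbol_bits) / n)   # ~3.2e-08 bits per letter

                p = [999_999_998 / n, 1 / n, 1 / n]          # the source's letter probabilities
                print(-sum(x * math.log2(x) for x in p))     # entropy ~6e-08 bits per letter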

            • taffer 3 hours ago

              > I.e., The string "I'v_" provides way more context than "con_" because you're much more likely to get I'm typing "I've" instead of "contraception"

              Yes the entropy of the next letter always depends on the context. One bit per letter is just an average for all kinds of contexts.

              > Also the fact that there are more than two letters also indicate more than one bit

              Our alphabet is simply not the most efficient way of encoding information. It takes about 5 bits to encode 26 letters, space, comma and period. Even simple algorithms like Huffman or LZ77 need only about 3 bits per letter. Current state-of-the-art algorithms compress the English Wikipedia using a mere 0.8 bits per character: https://www.mattmahoney.net/dc/text.html

        • taffer 3 hours ago

          In practice, it is even less. Current state-of-the-art algorithms compress the English Wikipedia using just 0.8 bits per character: https://www.mattmahoney.net/dc/text.html

      • samatman 2 hours ago

        Even very fast typists are unable to do stenography without a machine specialized to the task. Speech, in turn, can usually be understood at two or even three times the rate at which it is ordinarily produced. Meanwhile, I can read several times faster than I can understand speech, even at the highest speedup which I find coherent.

        Ergo, 10 bits per second just doesn't hold up. It's an interesting coincidence that a reasonably fast typing speed hits that rate, but humans routinely operate on language at multiples of it.

        • wat10000 2 hours ago

          I don’t think a difference of this magnitude meaningfully changes what the paper is talking about. They already have other human behaviors in their table with bit rates up to 5 times higher. Even if you set it at 100bps it wouldn’t change much. They’re addressing a difference of eight orders of magnitude. Making it seven instead of eight isn’t all that important.

      • formerly_proven 5 hours ago

        > English is about one bit per letter.*

        * when whole sentences or paragraphs are considered.

        • beng-nl 3 hours ago

          I’d say that is implied by “English.”

          Entropy is a measure of the source, not output.

        • wat10000 5 hours ago

          What else would we consider?

          • formerly_proven 5 hours ago

            The symbols aka words of the language itself?

            • wat10000 5 hours ago

              I’m afraid I don’t understand your point.

              If someone types English for a minute at 120WPM then they’ll have produced about 600 bits of information.

              Are you saying we should consider the rate in a smaller window of time? Or we should consider the rate when the typist is producing a series of unrelated English words that don’t form a coherent sentence?

              • mannykannot 4 hours ago

                From the paper:

                Take for example a human typist working from a hand-written manuscript. An advanced typist produces 120 words per minute. If each word is taken as 5 characters, this typing speed corresponds to 10 keystrokes a second. How many bits of information does that represent? One is tempted to count the keys on the keyboard and take the logarithm to get the entropy per character, but that is a huge overestimate. Imagine that after reading the start of this paragraph you are asked what will be the next let…

                English contains orderly internal structures that make the character stream highly predictable. In fact, the entropy of English is only ≈ 1 bit per character [1]. Expert typists rely on all this redundancy: if forced to type a random character sequence, their speed drops precipitously.

                [1] Shannon CE. Prediction and Entropy of Printed English. Bell System Technical Journal. 1951;30(1):50-64.

              • rvense 4 hours ago

                How do you measure information density of English text?

                • wat10000 4 hours ago

                  You show a bunch of English speakers some text that’s cut off, and ask them to predict the next letter. Their success at prediction tells you the information content of the text. Shannon ran this experiment and got a result of about 1 bit per letter: https://archive.org/details/bstj30-1-50/page/n5/mode/1up

                  • rvense 4 hours ago

                    OK. When talking about language I find it's always good to be explicit about what level you're talking about, especially when you're using terms as overloaded as "information". I'm not really sure how to connect this finding to semantics.

                    • wat10000 4 hours ago

                      If the text can be reproduced with one bit per letter, then the semantic information content is necessarily at most equal to N bits where N is the length of the text in letters. Presumably it will normally be much less, since there are things like synonyms and equivalent word ordering which don’t change the meaning, but this gives a solid upper bound.

    • GeoAtreides 5 hours ago

      The response to the question of "where do they get 10 bits/second" can be found in the paper, in great detail if I might add.

      • crazygringo 5 hours ago

        I don't have access. Nor do most of us here probably. Can you share for us then?

        • GeoAtreides 5 hours ago

          this thread has 20 comments at the time of writing my comment. About two of them contain a link to the full paper, please take a look.

          • ziddoap 3 hours ago

            It would be a lot less abrasive to say "It's linked elsewhere, but here it is: https://arxiv.org/abs/2408.10234" or some variation, instead of saying "it's here somewhere, go find it".

            • sam1r an hour ago

              Thanks for this. I scrolled for ages hoping for something like this ^

            • GeoAtreides 2 hours ago

              with all due respect, it was meant to be slightly abrasive. it's understandable (?) not finding something when the thread has hundreds of comments, not so much when the thread had like 15-20 comments.

      • t-writescode 43 minutes ago

        I was iterating over the different citations for bitrate, and at least some of them, like StarCraft and the Rubik's cube, are literally Guinness World Records entries: a tiny blurb about APMs and a video of a guy solving the Rubik's cube.

        Going from APM and/or image wiggling to "bits per second" is hilariously reductive, and I find it woefully incomplete at convincing this reader.

        And yeah, my immediate response to reading the title was "where the hell are they getting that number", so I have gone and looked and am unsatisfied.

  • GeoAtreides 5 hours ago

    I beg you, please read the paper before commenting. It's very interesting and it answers a lot of questions that might arise from just skimming the title.

    • MrMcCall 5 hours ago

      That might be the funniest comment I've ever seen on HN!

      A plea to reason, that is probably not outside the posting guidelines, but is certainly in a gray area :-)

      • GeoAtreides 4 hours ago

        I honestly don't understand why it would be funny or in a gray area to recommend that people actually read the paper?

        • MrMcCall 3 hours ago

          Asking people to read the article before commenting? A commonsense suggestion that even needs to be made makes me smirk inside, not least because I am guilty of this, too, around here. (But not this time, thank you, kind Sir.)

          As to being in a "gray area", have you read the posting guidelines? ;-)

          I'm pretty sure it says we shouldn't say things like "read the article" or "you haven't read the article, have you?" in our comments.

          Anyway, I'm laughing at this community (myself included) and the fact that your innocent and well-intentioned comment needs to be said here. And it did and does, my friend!

          • DamonHD 21 minutes ago

            I am very very annoyed by many of the shallow "it's obviously wrong" comments on this story. And thank you to those rebutting more politely than I feel inclined to.

            It's a fascinating paper and something that I have been interested in since before [0] and ties in to a strand of work in my PhD research. Also see for example [1].

            [0] Stevens, M. Sensory Ecology, Behaviour, and Evolution, OUP Oxford, 2013, ISBN 9780199601783, LCCN 2012554461

            [1] Coupé, Christophe; Oh, Yoon Mi; Dediu, Dan; Pellegrino, François. Different languages, similar encoding efficiency: Comparable information rates across the human communicative niche. Science Advances (AAAS), 2019-09, volume 5, number 9, ISSN 2375-2548, doi:10.1126/sciadv.aaw2594

    • michaelt 5 hours ago

      Buddy, I followed the link and they want $35.95 to read the paper.

      This is... not a recipe for a successful discussion between people who have read the paper.

    • ganzuul 4 hours ago

      I conclude that if you perform horrific experiments on animals then our intelligent universe reduces the rate at which you can continue to 10bps.

      This is why enlightenment cures you of your curiosity.

      • MrMcCall 3 hours ago

        Only 10 beatings per second? This is a just universe, Sir!

        On a serious note, enlightenment only cures us of our selfish curiosity, i.e. any action which causes harm to others. The Way requires us to harmonize with universal compassion, so there is take and give (especially with regard to our required sustenance), but we definitely lose our propensity to experiment with our power at the expense of others. No, we are to increase our curiosity in how we can better help others, help being the cornerstone of compassion.

    • imtringued 4 hours ago

      I don't need to read the paper. The problem is that mechanical systems have inertia and are limited in their ability to change direction and thereby their ability to signal discrete information.

  • codedokode 2 hours ago

    > In particular, our peripheral nervous system is capable of absorbing information from the environment at much higher rates, on the order of gigabits/s. This defines a paradox: The vast gulf between the tiny information throughput of human behavior, and the huge information inputs on which the behavior is based. This enormous ratio – about 100,000,000 – remains largely unexplained

    The GPU is capable of performing billions of operations per second, yet Cyberpunk barely runs at 60 fps. And there is no paradox at all.

    By the way, the brain seems to perform better than a GPU at tasks like image recognition. Probably because it does even more operations per second than the GPU.

    • codedokode 2 hours ago

      There is also another comparison. Imagine if your goal is to calculate an integral over a 100-dimensional space (or solve a quantum system) and answer whether it is larger or less than zero. This will take enormous time but produces a single bit of information.
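
      A toy version of that in code (the integrand is hypothetical; the point is how much work goes into one bit of output):

        import math, random
        random.seed(0)

        def f(x):                                   # some arbitrary 100-dimensional integrand
            return math.sin(sum(x)) - 0.1

        samples = 100_000                           # lots of computation...
        estimate = sum(f([random.random() for _ in range(100)]) for _ in range(samples)) / samples
        print(1 if estimate > 0 else 0)             # ...and the entire answer is one bit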

    • bee_rider 2 hours ago

      Is the brain better than a GPU at image recognition nowadays? Actually I’m not sure how that’s measured. Certainly a GPU could be tied to a database with a lot more things in it, like you can get some pretty impressive facial recognition demos where it’ll recognize a ton of random people.

      But humans can see objects they’ve never seen before and sometimes guess what they might be used for, which is sort of like object recognition but better. (Or sometimes I see an object I’m technically familiar with, like an old tool of my grandpa’s, and remembering what he used it for feels more like imagining… maybe it is).

  • gchamonlive 2 hours ago

    The authors of the article are smuggling in the assumption that 10bits/s is slow.

    It's slow when compared to the general-purpose computing systems we have implemented in a silicon substrate.

    But this assumption doesn't translate linearly to the brain throughput and the perception of existence.

    In my opinion the hypothesis is meaningless.

    That is not to say the article is meaningless. Actually being able to measure brain information throughput is amazing. It's only that slowness isn't absolute.

  • l33tman 3 hours ago

    The optic nerve carries information at a rate of around 10 Mbit/s (ref https://pmc.ncbi.nlm.nih.gov/articles/PMC1564115/). Concentrating only on the symbolic thinking speed seems unnecessarily restrictive.

    • nabla9 an hour ago

      The only limit is the max size a Hacker News title allows. :)

  • metalman an hour ago

    I tried to read the article, but celldotcom has a presumably very high bit rate robot that promptly questioned my humanity, so I did the dishes and ate lunch, but that didn't get through somehow as proof (of humanity). And so my recourse is to then read the comments here, to try and get the gist of the argument, but even well fed, doing 11 or maybe 12 bits per second, there does not seem to be any point in quibbling with reality. Maybe after a bit of shock-o-late ice cream (B and G chocolate therapy with espresso beans added).

  • MrMcCall 5 hours ago

    The only time 'bits' will ever be an appropriate measure of human processing is when we are processing or producing digital information artifacts, e.g. a typewritten document.

    Our bodies' systems are biochemical wetware that will never be aptly described using a boolean basis. That is one of the primary problems of society's obsessions with classical notions of gender.

    No one is male OR female. We are, every single one of us, a combination of male and female hormones. The more "male" a person is is the result of that balance favoring the male hormones; and vice versa. What humanity is now struggling with is that there are plenty of folks with lots of both or little of either and all kinds of combinations.

    Of course, my not being a biochemist means my categorization of hormones into "male" and "female" is, itself, likely to be a poorly booleanized representation of their natures.

    We are much more akin to Boltzmann's statistical mechanics description of reality, than to digital logic's boolean description.

    • Asraelite 3 hours ago

      Bits are a perfectly acceptable way to measure biological information processing. These are not the boolean logic digital bits like on a computer. They're the more abstract concept of a bit in information theory.

      Take the number of distinct possible configurations a system can be in (accounting for statistical uncertainty/biases if needed), take the base 2 logarithm of that number, and you have the bits of information in the system. This can be applied to basically anything, biological or otherwise.
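
      For instance (information-theoretic bits, not storage bits):

        import math
        print(math.log2(1024))                      # 10.0 bits distinguish 1024 states
        print(math.log2(26))                        # ~4.7 bits for one unbiased letter
        p = [0.5, 0.25, 0.25]                       # with unequal probabilities,
        print(-sum(x * math.log2(x) for x in p))    # use Shannon entropy: 1.5 bits/symbol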

      • MrMcCall 3 hours ago

        But if your measurements are unreliable or downright flawed, then it's just garbage-in-garbage-out.

        Sounds like the statistics in the papers from the social "sciences".

        "There's lies, damned lies, and statistics." --Unknown

        I don't think you're going to be able to count the "number of distinct possible configurations" of an even moderately complex living system.

        • aeonik 2 hours ago

          It's more like statistical mechanics and the foundations of the second law of thermodynamics.

          Unless entropy is a damned lie. Which I'm not saying it isn't, but claiming such a thing is a pretty strong claim. Possibly one of the strongest claims you can make in physics (which is why it's associated with cranks).

          I'd expect some perpetual motion machines after overturning such a principle.

          But I do agree you need to be careful defining the scope of microstates and macro states.

      • aziaziazi 3 hours ago

        > Take the number of distinct possible configurations a system

        Easy for an isolated system. The human body is 6000 billion cells; each of them has many possible configurations, and most of them share and process information. I respectfully doubt there's much to do with bits outside of a tiny bit of flesh in a petri dish.

    • beezlebroxxxxxx 4 hours ago

      > That is one of the primary problems of society's obsessions with classical notions of gender.

      What you go on to discuss is sex, and sexual dimorphism, which is a remarkably robust way of classification. The "classical" notions of gender (tbh, "classical" doesn't make much sense here) as sex based is fairly reasonable all things considered. Consider the arguments presented in this essay [0]. That, however, doesn't really mean much for how we should treat people in public who desire to express their gender in different ways, which is, of course, respecting of their dignity and desires, in most cases.

      [0]: https://philosophersmag.com/unexceptional-sex/

      • MrMcCall 4 hours ago

        Well said.

        Yeah, what I mean by classical would boil down to just genitalia, which doesn't really hold up in how we must respect the person and how they feel and choose to express themselves. Yes, so long as their expressions are not harming others, then we must respect their human right to choose who they are.

        I've got to give a huge hat tip to Suzi (Eddie) Izzard, who -- beyond their being just a brilliant comic and generally good human being -- taught me and my fam about how the spectrum of human configuration is way more complex than just male and female.

        Cheers, friend.

    • GuB-42 2 hours ago

      The use of "bits" here doesn't mean we are working in binary.

      It is more like the way it is used in information theory. The number of bits is log2 of the number of states that can be represented, and it doesn't have to be an integer. For example, with 10 bits of information, we can distinguish between 1024 different states, it can be 1024 colors for instance, or 1024 genders if you wish, it doesn't matter, the important part is that there are 1024 boxes to put things in, no matter what they are. Of course, it doesn't mean that only 1024 colors exist in the universe, there are an infinity of them, but with 10 bits, you can only distinguish between 1024 of them. If you want more, you need more bits, if you can do with less, you need less.

      By the article's results, it means your "inner brain" can process one color with 1024 nuances per second, or 2 independent colors with 32 nuances each per second. If the colors are not independent, it can process more, because if, say, you know that the two colors are highly contrasting, you don't have to allocate "boxes" for non-contrasting colors, which may free some boxes for more nuances; so you may, for instance, process two contrasting colors with 100 nuances each with these 10 bits.

    • VyseofArcadia 3 hours ago

      A bit is the fundamental unit of information theory, and has nothing to do with digital logic in this context. No one is saying "ok use one bit to encode male or female". No one is saying "ok 101 means serotonin, and 110 is dopamine". What they are saying is that the information content produced by a human being can be compressed down to about 10 bits per second, but this is a statistical description.

      • MrMcCall 3 hours ago

        You said both

          nothing to do with digital logic in this context
        
        and

          compressed down to about 10 bits per second
        
        Sounds like digital compression from where I sit, friend.

        Are you using an information theory that is based upon something different from Shannon's?

        • VyseofArcadia 3 hours ago

          "Compression" here is nontechnical, and I was using it by analogy as an aid to intuition. I didn't want to throw out the E word (entropy) unnecessarily.

          • MrMcCall 2 hours ago

            Are you using "bit" in a sense different to Wikipedia's definition, as linked from Claude Shannon's page?

    • Retric 3 hours ago

      Boolean logic extends just fine to handle complexity. Instead it’s the intuitive classification people come up with that are often a poor fit for reality.

      Is someone’s DNA consistent throughout their body? Y/N Does someone have any chromosomal anomalies? Y/N etc

      Similarly it’s very possible for a girl to suffer from abnormally low testosterone levels which doesn’t fit with how the public thinks of it as a gendered hormone. During puberty it normally spikes in both girls and boys. From a range of (2.5 - 10) in prepubescents, the typical range in puberty for boys is much higher (100 - 970) vs (15 - 38) but that doesn’t make it a male hormone just a pathway used differently.

    • psychoslave 4 hours ago

      >What humanity is now struggling with is that there are plenty of folks with lots of both or little of either and all kinds of combinations.

      Even that is a very smooth view of humanity, as if it was all going through more or less the same mindset.

      Rest assured that most of humanity don’t conceive their life experience according to a scientific measure of information units.

    • malfist 4 hours ago

      In biology, or really most sciences (math being an exception), the more closely you examine a delineated this or that categorization, the more you realize it's a scale, a range, or something fuzzy.

      Like even things we talk about regularly, like touch and space, are vague in the details. Is it still touching if the repulsive force of electron to electron is keeping the nuclei apart? Where does empty space begin and an atom end? Is it after the electron shell? Outside of its repulsive force? Some hybrid value?

      • psychoslave 4 hours ago

        Surely you will enjoy https://en.wikipedia.org/wiki/Fuzzy_mathematics

        Also remember that putting a topic under mathematical form or mere layman prose is also a spectral arbitrary categorization.

      • the__alchemist 3 hours ago

        To address your "empty space" question, you must first define, specifically, what you mean by this phrase.

      • MrMcCall 4 hours ago

        I hope you're not asking me those questions ;-)

        Yeah, those are great questions, for sure.

        I can always be awestruckdumb by the understanding that we are all mostly space inhabited by fields, our littlest bits vibrating at mindblowing speeds.

    • countarthur 2 hours ago

      What you're saying is interesting but I think the causality is backwards here and I can provide some examples to show why.

      (By male hormone I'm assuming you mean testosterone, and by female hormone I assume you mean oestrogen.) If being "more male" came from having more testosterone (and vice versa), then logically, when children go through puberty and develop into adults, they would become "more" male or "more" female.

      As adults become elderly and naturally produce less sex-associated hormones, they would become "less" male or female.

      (Fetuses do not all begin in the womb as female, that's a common misunderstanding. We start off physically undifferentiated, and develop along a genetically predetermined pathway as we grow. Some animals use temperature or other environmental triggers to pick, humans use genes.)

      Would that mean a male bodybuilder who injects testosterone is more male than a man that doesn't? His phenotype may become visibly more masculine, but that doesn't change his sex at all. Same for a female bodybuilder that injects testosterone - she may develop stereotypically male physical characteristics like large muscles and a deeper voice, but her sex is unaffected.

      The causality is the other way: being male or female results in a physiology (adult testes/ovaries) that produces sex-associated hormones to a greater or lesser degree depending on the person (and in some cases in very low amounts, or not at all).

      This makes sense if sex is a binary (with rare differences of sex development - detailed here https://www.theparadoxinstitute.com/read/sex-development-cha... ) that results in different levels of sex hormones in the body and resulting phenotype. So yes, everyone is male or female.

      (I'm not referring to gender here - I'm talking only about sex)

      If there's a spectrum then some men could be biologically "more male" than others and vice versa for women. I've not seen any evidence of this myself, but I'm happy to be proven wrong!

    • gmadsen 3 hours ago

      It is a categorization, like all things in biology. One of the most robust and significant ones for all of life is sexual versus asexual reproduction. Saying it is not a binary intentionally blurs understanding. This is not a Gaussian situation, and not fitting into this categorization is exceedingly rare, due to defects/mutations which largely do not proliferate genetically.

    • johnnyjeans 3 hours ago

      > Our bodies' systems are biochemical wetware that will never be aptly described using a boolean basis.

      All physical systems can be described on a base-2 basis using bits, i.e. Shannon entropy.

    • uoaei 3 hours ago

      I think you've mixed up a few mostly unrelated things together to make a point. You're correct in that the larger point to be made is that analog and digital computing are paradigmatically distinct and analogies are hard to draw across that divide.

      However, "bits" is just a quantity of information in a certain base. We could discuss it in "nits" if you prefer. The point is that information per se remains real even if the specific representation is based on some assumption of digital computing.
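
      A tiny illustration that the unit is just a choice of logarithm base (reading "nits" as the natural-log unit, also called nats):

        import math

        p = 1 / 1024                            # one outcome among 1024 equally likely ones
        bits = -math.log2(p)                    # 10.0 bits
        nats = -math.log(p)                     # ~6.93 nats ("nits")
        print(bits, nats, nats / math.log(2))   # the last value recovers 10.0 bits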

      The rest of your comment is unfortunately out of scope of this article although it deserves some discussion on its own merit.

  • titzer an hour ago

    > More generally, the information throughput of human behavior is about 10 bits/s.

    I'm sorry, I just can't take this article seriously. They make a fundamental mistake of encoding and assume that information is discretized into word-sized or action-sized chunks.

    A good example is a seemingly discrete activity such as playing a musical instrument, like a guitar. A guitar has frets and strings, a seemingly small number of finite notes it can play. So it would seem a perfect candidate for discretization along the lines of the musical scale. But any guitar player or listener knows that a guitar is not a keyboard or midi synth:

    1. The attack velocity and angle of the pick convey aggression and emotion, and not just along a few prescribed lines like "angry" or "sad" or "loud" or "quiet".

    2. Timing idiosyncrasies: being slightly before or after the beat, speeding up or slowing down, or even playing arrhythmically; the entire expression of a piece of music is changed by subtleties in phrasing.

    3. Microbends. The analog nature of strings cannot be hidden entirely behind frets. Differences in the amount of pressure, how close to the fret the fingers are, and slight bending of the strings, intentional or unintentional, static or dynamic, change the pitch of the note.

    4. Non-striking sounds like the amount of palming, pick scraping, tapping, and sympathetic vibrations.

    Of course there are lots of other things. All of these things make the difference between a master guitar player, say Hendrix, and someone just playing the same notes.

    And yes, of course we can consider the encoding of the audio coming out of the guitar to be information, at a much higher bitrate. But what about the facial expressions, body language, etc.? There are tons of channels coming off a musician, particularly in live performances.

    This entire article just misses these by picking a quantized encoding of information that, of course, has a low bitrate. In short, they are missing bazillions of channels, not the least of which are expression and timing.
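
    To make the encoding-dependence concrete, here is a rough sketch comparing a note-event encoding with raw audio of the same performance; all the numbers are illustrative assumptions, none are from the article.

      import math

      # 1. Note-event encoding: ~10 notes/s, each one of 128 pitches with
      #    8 velocity levels -> log2(128 * 8) = 10 bits per note.
      note_rate = 10
      bits_per_note = math.log2(128 * 8)
      print(note_rate * bits_per_note)      # ~100 bits/s

      # 2. Raw audio of the same performance at CD quality.
      print(44_100 * 16 * 2)                # 1,411,200 bits/s

      # Attack, micro-timing, bends and scrape noise all live in the gap
      # between those two numbers.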

  • aithrowawaycomm 5 hours ago

    I think the authors are using information theory to inappropriately flatten the complexity of the problem. On one hand we have “bits” of pre-processed sensory measurement data, then on the other hand we have “bits” of post-processed symbolic data: in many cases directly so via human language, but that would also include “the Terran unit moved a short distance” as a compact summary of a bunch of pixels updating in StarCraft. This even extends to the animal examples: the 10 bits/s figure applies to higher-level cognition. The crucial difference is that the sensory bits can be interpreted via the same “algorithm” in a context-independent way, whereas the higher-level cognition bits need their algorithms chosen very carefully (perhaps being modified at runtime).

    So I am just not sure why 10 bits/s of symbolic data processing is especially slow in the first place. We don’t have a relevant technological comparison because none of our technology actually processes data in that fashion.

    • kbelder 39 minutes ago

      I'm running a chat LLM on my local pc. It spits out text just slightly faster than I can type, but it is using much of my CPU and redlining my GPU.

      Is it processing at a dozen bits per second, or hundreds of millions?

      If the text the LLM generates is "that is true", can I consider that one bit of information?

      I agree, they're artificially simplifying the framing of the question to generate a lower number than is sensible.
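
      One way to put the LLM in the same units is tokens per second times entropy per token; the throughput and vocabulary figures below are assumptions for illustration only.

        import math

        tokens_per_second = 20       # assumed local-model throughput
        vocab_size = 32_000          # assumed tokenizer vocabulary size

        # Upper bound: every token maximally surprising.
        print(tokens_per_second * math.log2(vocab_size))   # ~300 bits/s

        # In practice the output is far more predictable, so the effective
        # rate is much lower; that is exactly the ambiguity above, where
        # "that is true" could count as three tokens or as one bit.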

    • aeonik 2 hours ago

      It's more like quantum information theory, isn't it?

      https://en.wikipedia.org/wiki/Quantum_information

    • nis0s 5 hours ago

      When compared directly to the 10^9 bits/s for sensory information, which uses the same type of information, it is slow.

  • PittleyDunkin 3 hours ago

    > Why can we only think about one thing at a time?

    This sort of reasoning seems to be a symptom of inadequate communication/jargon/diction for describing mental faculties. Many times during serious thought there's no discrete "number of thoughts" occurring at all: there's just a hazy mental process that resolves to some result, and often many results. This reminds me of the "80% of people have no inner monologue!!!" bullshit that went around recently.

  • constantcrying an hour ago

    It seems the authors conflate a problem being easy to state with little processing power being needed to solve it. This obviously isn't true: very complex mathematical problems can be stated in very few bits. Human interactions can often be extremely complex, even though they are relatively slow.

    Reading a text isn't about matching symbols to words. It is about taking those words and putting them into a social context, potentially even doubting their content or imagining the inner world of the author. Obviously that is what the "inner" brain (whose existence seems very dubious to me) has to do.

    I see absolutely no paradox at all.

  • flerchin 3 hours ago

    On average, people can read at least 200 WPM, and much faster at the top end. This is orders of magnitude higher than 10 bps.

  • mulippy 2 hours ago

    Seems like consciousness is the bottleneck. It has to integrate over all the perceptions. Of course this will be slower!

  • cabirum 6 hours ago

    > thinker can access about 2^20 ≈ 1 million possible items in the few seconds allotted

    Huh, no? No one is able to think about a million items in a few seconds.

    The 20-questions thinking process involves bringing to mind an incomplete set of abstract categories and asking a question that divides those categories into two halves (binary search). You don't even start from scratch; you use previous experience (a cache) to reuse whatever worked best the last time.
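
    A minimal sketch of that binary-search view: each yes/no answer halves the candidate set, so 20 answers single out one item among about a million without ever enumerating them.

      candidates = 2 ** 20                 # ~1 million possible items
      questions = 0
      while candidates > 1:
          candidates //= 2                 # one well-chosen yes/no question
          questions += 1
      print(questions)                     # 20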

  • FL33TW00D 5 hours ago

    The title is a reference to "The unbearable lightness of being" by Milan Kundera for those unaware.

  • luka598 6 hours ago
    • tetris11 5 hours ago

      Really fun paper. I especially enjoyed this section:

      > Based on the research reviewed here regarding the rate of human cognition, we predict that Musk’s brain will communicate with the computer at about 10 bits/s. Instead of the bundle of Neuralink electrodes, Musk could just use a telephone, whose data rate has been designed to match human language, which in turn is matched to the speed of perception and cognition

      • capitainenemo 3 hours ago

        It might be, though, that even if our processing rate is limited to 10 bits per second, shortening the communication loop between the helper AI and the human would allow the human to switch subjects more productively by getting faster feedback. The human would be in an executive-approval role, like the lead character in Accelerando with his agents, assuming they trusted the agents they delegate to.

  • devenson 40 minutes ago

    Thinking is emulated -- therefore much slower.

  • linuxdude314 3 hours ago

    FWIW the title of this article is a play on the title of Milan Kundera's famous book "The Unbearable Lightness of Being".

  • gmuslera 5 hours ago

    Why not? Processing an input of 10^9 bits, making sense of all of it, and contrasting it against all your existing knowledge, with an output speed of 10 bits/s? That is not so bad. At least if we were really processing all that information in the same way.

    It had to be enough to let us survive, given the challenges we faced through most of our evolution. We took a lot of shortcuts and trimmed a lot along the way; that is why we have a System 1 and a System 2 in place, with a lot of built-in cognitive biases as a result.

  • Helmut10001 an hour ago

    I read somewhere that the eye transmits 10 million bits per second to the brain. I think all of this is a matter of perspective.

  • nis0s 6 hours ago

    It seems this analysis is incorrectly assuming a serial communication mode for neuronal transmission, which isn’t what happens.

    • GeoAtreides 5 hours ago

      I have read the paper, and your thesis, that the analysis assumes a serial communication mode for neuronal transmission, is incorrect.

      • nis0s 5 hours ago

        They say this directly in section 7.2,

        > In contrast, central processing appears to be strictly serial…

        and then they proceed to give misinterpreted evidence of serialization, because they're making assumptions about lower-level biochemical behavior based on higher-level tissue performance. In fact, that tissue-level behavior isn't correctly described either.

        • GeoAtreides 4 hours ago

          Be honest.

          The whole paragraph is:

          "In contrast, central processing appears to be strictly serial: When faced with two tasks in competition, individuals consistently encounter a “psychological refractory period” before being able to perform the second task broadbent_perception_1958, pashler_dual-task_1994. Even in tasks that do not require any motor output, such as thinking, we can pursue only one strand at a time."

          Clearly they're not talking about "neuronal transmission" but about tasks, and furthermore, they cite their sources.

          • nis0s 2 hours ago

            I wasn’t being “dishonest”; I couldn’t copy/paste the entire text on my phone.

            I addressed the rest of that statement in my comment by noting that you can’t make the same assumptions about biochemical reactions and emergent behaviors of tissues.

            Secondly, even from a neurophysiology perspective, their cited evidence is misinterpreted. Any basic dual N-back task proves their central thesis incorrect.

  • cebert 6 hours ago

    > Plausible explanations exist for the large neuron numbers in the outer brain, but not for the inner brain, and we propose new research directions to remedy this.

    Going out on a limb here, but perhaps we shouldn’t modify the biological composition of the human brain.

    • mtlmtlmtlmtl 6 hours ago

      They're not talking about changing the brain. They're talking about remedying the lack of a plausible explanation.

      • thegeomaster 6 hours ago

        I think the parent commenter was making a joke.

  • ayongpm 6 hours ago

    Interesting summary; where can I get the full text?

  • cess11 4 hours ago

    I see there are some people in the thread who doubt the low bandwidth between conscious thought and the rest of the central nervous system.

    Do you also doubt that you're actually living half a second in the past, with the brain compensating for this lag between initial perception and conscious reception of the indirect effects of it?

  • Zondartul 2 hours ago

    Ask stupid questions, receive stupid answers.

  • wigster 4 hours ago

    perhaps the 10 bits/s is throttled at the simulation level ;-)

  • newswasboring 5 hours ago

    Found the preprint, if you don't have access:

    https://arxiv.org/abs/2408.10234

  • VoodooJuJu 3 hours ago

    Bits are a unit of measurement we use in relation to computers. Humans are not computers. Do not use bits to measure anything in relation to humans. Stop thinking of humans as computers. It's dehumanizing, unhealthy, and a very narrow way to think of people. Using a computer model for humans is useless at best and misleading at worst.

    • skibz 2 hours ago

      This paper isn't proposing that humans can be understood using a "computer model".

  • m3kw9 3 hours ago

    Our awareness of time seems arbitrary. It feels like time went by fast or slow; how does that work, anyway?

  • nusl 6 hours ago

    It's 10^9 bits/s. Your title is wrong.

    • ceejayoz 6 hours ago

      > The information throughput of a human being is about 10 bits/s. In comparison, our sensory systems gather data at ~10^9 bits/s.

      The title appears to be accurate?

      • mxfh 4 hours ago

        Just for playing any sport, the accuracy needed to instruct hundreds of muscles to work in a certain way is certainly above those 10 bits.

        Pointing out positions in a 10cm x 10cm x 10cm cubic volume seems possible at significantly faster than 1/s.

        The slower examples listed in the table all have some externalities, like motor/object-manipulation feedback-loop overhead (speed cubing) and/or redundancy, and are not optimized for pure information density, so I have no idea why they settled on that average and not the optimum.

        Object Recognition and Reading are already at ~50 bits.

        https://arxiv.org/html/2408.10234v2#S3
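
        Rough numbers for the pointing example (the 1 cm resolution is my assumption): a 10 cm cube at 1 cm accuracy has 1000 distinguishable targets, about 10 bits per pointing act, so the rate scales with how many such acts per second you grant.

          import math

          targets = 10 * 10 * 10                  # 10cm cube at 1cm resolution
          bits_per_point = math.log2(targets)     # ~10 bits per pointing act
          for acts_per_second in (1, 3, 5):
              print(acts_per_second, round(acts_per_second * bits_per_point, 1))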

        • ceejayoz 3 hours ago

          > Just for playing any sport the accuracy to instruct 100s of muscles to work in a certain way is certainly above that 10bits

          But significant portions of that process are not done by the conscious brain, and some aren't done by the brain at all (reflex and peripheral nervous system). We don't consciously think about each of the 100 muscles we're switching on and off at rapid speed.

  • kidel001 2 hours ago

    These types of articles are so fundamentally flawed... it beggars belief. Why not ask the opposite question: if bandwidth works the way they describe, why can't H100 GPUs (~3 TB/s memory bandwidth) perform sensorimotor tasks 2.4 trillion times faster than a human? (Spoiler alert: they cannot.)

    <s> Could it be... there is a bit of a straw man argument here? About how much information it actually takes to input and output a complete sensorimotor task? I dare say! </s>
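
    For what it's worth, the ratio behind that comparison, taking the 3 TB/s figure above at face value:

      h100_bytes_per_s = 3e12     # ~3 TB/s HBM bandwidth, as stated above
      human_bits_per_s = 10       # the paper's behavioural figure
      print((h100_bytes_per_s * 8) / human_bits_per_s)   # ~2.4e12, i.e. ~2.4 trillion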

  • lccerina 6 hours ago

    This is such a bad paper. Almost all the calculations and equations look like back-of-the-envelope estimates. A decent researcher would have provided some tests of their hypotheses.

    • GeoAtreides 5 hours ago

      The numbers cited and used in the calculations are supported by citations. The purpose of this paper is not to test a hypothesis or to gather new data, but to think about existing data and new directions of research. This is spelled out in the paper's abstract, which is a kind of summary of the whole paper, useful for getting a very quick idea of the paper's purpose; it is expanded further in the paper's introduction and revisited again in the paper's conclusion.

      • lccerina 3 hours ago

        Thank you for explaining what an abstract is... The fact that those numbers come from a citation doesn't make them true. This is a badly written paper that a decent researcher wouldn't have written (and I know the author has many papers; I am speaking about this one) and a decent reviewer would have rejected. A paragraph about Elon Musk? Guesstimates of information rates? As a blog post it would have been okay-ish; as a scientific paper it is quite bad.

  • codedokode an hour ago

    The 10 bit/s processing rate doesn't explain why a human talks better than an LLM that consumed terabytes of data during training.