NIH: 100M Years to Change a Binding Site

Status
Not open for further replies.

Stripe

Teenage Adaptive Ninja Turtle
LIFETIME MEMBER
Hall of Fame
If that makes me a bigot, fine with me.

It's this that makes you a bigot...
..if I ... thought it drove another stake into poor Chuck's heart...

Fundies would rather quote mine. Much easier, and easier to sell to the true believers. The real issue is the lack of intellectual ability to understand; well, actually the real issue is the failure to really bother to try to understand.
... you mental midget.
 

Jukia

New member
No, it's nowhere near accurate. And, yeah. You are a bigot.

Atheists generally are when it comes to these discussions.

Nope, this is what I said:

"No, I think what Alate suggested was that the paper does not support Pastor Bob. However, to reach that conclusion one would need to read the paper and understand the math rather than just reading the abstract or the review of the abstract/paper by some IDer.
For the record, I took a quick look at the paper and am sure it would take me a good 2 weeks to relearn the math (assuming I ever knew it in the first place!) to understand it. Although, if I were really interested and thought it drove another stake into poor Chuck's heart, I might even try to contact the authors. Fundies would rather quote mine. Much easier, and easier to sell to the true believers.
The real issue is the lack of intellectual ability to understand; well, actually the real issue is the failure to really bother to try to understand.
Color me surprised."

Totally accurate. You are simply too closed-minded and stuck on the ancient writings of goat and sheep herders to bother to even attempt to understand.
 

Stripe

Teenage Adaptive Ninja Turtle
LIFETIME MEMBER
Hall of Fame
Nope, this is what I said:

"No, I think what Alate suggested was that the paper does not support Pastor Bob. However, to reach that conclusion one would need to read the paper and understand the math rather than just reading the abstract or the review of the abstract/paper by some IDer.
For the record, I took a quick look at the paper and am sure it would take me a good 2 weeks to relearn the math (assuming I ever knew it in the first place!) to understand it. Although, if I were really interested and thought it drove another stake into poor Chuck's heart, I might even try to contact the authors. Fundies would rather quote mine. Much easier, and easier to sell to the true believers.
The real issue is the lack of intellectual ability to understand; well, actually the real issue is the failure to really bother to try to understand.
Color me surprised."

Totally accurate. You are simply too closed-minded and stuck on the ancient writings of goat and sheep herders to bother to even attempt to understand.
Bigot.
 

Frayed Knot

New member
I listened to the show, and was disturbed to hear information theory bungled up so badly. Bob and Fred were yucking it up about those crazy evolutionists/mathematicians who say that making something more random actually increases the information content.

Yet again, Bob and Fred just don't understand the math and science. Increasing the randomness of a string of information does increase its information content. Complexity is the same as information content, which is measured as entropy. Random sequences have maximum complexity, therefore maximum information content and entropy.

This is NOT controversial; it's settled.

As an aside, anyone who tries to apply the 2nd Law of Thermodynamics to information is saying that entropy has to increase, which is the same as saying that information must increase. (However, it's not valid to apply the 2nd LOT to information in the first place.)
 

Stripe

Teenage Adaptive Ninja Turtle
LIFETIME MEMBER
Hall of Fame
Yet again, Bob and Fred just don't understand the math and science. Increasing the randomness of a string of information does increase its information content. Complexity is the same as information content, which is measured as entropy. Random sequences have maximum complexity, therefore maximum information content and entropy.
No, you're wrong. In Shannon's terms, information is increased by having a string of data with more of certain values than others, like a piece of literature having more 'e's than 'x's. Randomising the data set levels out those differences and decreases the information.

As an aside, anyone who tries to apply the 2nd Law of Thermodynamics to information is saying that entropy has to increase, which is the same as saying that information must increase. (However, it's not valid to apply the 2nd LOT to information in the first place.)

The second law of thermodynamics is how we describe entropy working on hot bodies. A very similar description (practically identical) describes how entropy works on information.
 

Frayed Knot

New member
In Shannon's terms, information is increased by having a string of data with more of certain values than others, like a piece of literature having more 'e's than 'x's. Randomising the data set levels out those differences and decreases the information.
No, you have it backwards. Shannon defined the information content of a message as the log of the inverse of its probability. If you know ahead of time that a sequence has more of one letter than another, that prior knowledge means each symbol carries less information. The way to maximize the information content of a string of data is for the data to be completely unpredictable; i.e., random.
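To make the point concrete, here's a quick sketch (my own snippet, not anything from the show or the paper; the function name and test strings are made up) computing Shannon entropy per symbol. A string dominated by one letter is highly predictable and scores low; a string where every symbol is equally likely scores the maximum:

```python
import math
from collections import Counter

def shannon_entropy(s):
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Heavily biased toward 'a': highly predictable, so low entropy per symbol.
low = shannon_entropy("aaaaaaab")

# All eight symbols equally likely: maximum entropy, log2(8) = 3 bits/symbol.
high = shannon_entropy("abcdefgh")

print(low, high)
```

The biased string comes out around 0.54 bits per symbol versus 3 bits for the uniform one, which is exactly the "unpredictable = more information" direction.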

Of course, Claude Shannon was an atheist so he probably said this just to disobey God or something.

The second law of thermodynamics is how we describe entropy working on hot bodies. A very similar description (practically identical) describes how entropy works on information.
If you say so, but you're saying that information content must increase over time. Complexity, information content, and entropy are all describing the same thing.
 

Frayed Knot

New member
Stripe, I came across the Wikipedia article on Entropy (information theory), which goes into more detail about what I said and pretty explicitly backs me up. Some key points from there (bolding is mine):

Entropy is a measure of disorder, or more precisely unpredictability. For example, a series of coin tosses with a fair coin has maximum entropy, since there is no way to predict what will come next. A string of coin tosses with a two-headed coin has zero entropy, since the coin will always come up heads. Most collections of data in the real world lie somewhere in between.
English text has fairly low entropy. In other words, it is fairly predictable. Even if we don't know exactly what is going to come next, we can be fairly certain that, for example, there will be many more e's than z's, or that the combination 'qu' will be much more common than any other combination with a 'q' in it and the combination 'th' will be more common than any of them. Uncompressed, English text has about one bit of entropy for each byte (eight bits) of message.
If a compression scheme is lossless—that is, you can always recover the entire original message by uncompressing—then a compressed message has the same total entropy as the original, but in fewer bits. That is, it has more entropy per bit. This means a compressed message is more unpredictable, which is why messages are often compressed before being encrypted. Shannon's source coding theorem says (roughly) that a lossless compression scheme cannot compress messages, on average, to have more than one bit of entropy per bit of message. The entropy of a message is in a certain sense a measure of how much information it really contains.

and

Shannon's entropy measures the information contained in a message as opposed to the portion of the message that is determined (or predictable). Examples of the latter include redundancy in language structure or statistical properties relating to the occurrence frequencies of letter or word pairs, triplets etc. See Markov chain.

So the predictable thing you mentioned earlier, the fact that the letter 'e' is more common than the letter 'z', is an example of why English text has less information content than a random string of the same length.
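The compression point from the quoted article can be checked in a few lines. This is a hypothetical demo of my own (the sample text and variable names are made up): redundant English-like text compresses dramatically under a lossless compressor, while uniformly random bytes barely compress at all, because they were already near maximum entropy per bit:

```python
import random
import zlib

# Redundant, predictable text: low entropy per byte.
english = (b"it is a truth universally acknowledged that a single man "
           b"in possession of a good fortune must be in want of a wife ") * 20

# Uniformly random bytes of the same length: near-maximum entropy per byte.
random.seed(0)
noise = bytes(random.randrange(256) for _ in range(len(english)))

# Lossless compression squeezes out predictability; what's left is the information.
ratio_english = len(zlib.compress(english)) / len(english)
ratio_noise = len(zlib.compress(noise)) / len(noise)

print(ratio_english, ratio_noise)
```

The English sample shrinks to a small fraction of its size; the random bytes stay at (or slightly above) their original size, matching the claim that a random string of the same length carries more information.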

It's probably too much to ask Bob and Fred to man-up and retract what they said, but I would expect any honest person to at least stop making false claims after it's been shown to them to be false.
 

Alate_One

Well-known member
Sounds like something we could discuss. Why didn't you bring this up in your first post instead of the vague and irrational content you chose?
You were the only one that failed to comprehend. Everyone else seemed to manage quite well.

Who is Michael Behe and who needs this paper to intend non-support of evolution?
The very purpose of the paper was to attack and debunk the exact idea that Bob is promoting (which was far more clearly put forward in Behe's book). The idea is essentially that evolution is "too hard" mathematically. The particular TYPE of change that was tested (and found to not happen very often) was one that was very difficult. It's not as if every binding site change is equally problematic.

Was the abstract wrong?
No, but drawing the conclusions Bob and co. did from it is moronic, to say the least. Claiming that a paper making the exact opposite point to yours actually supports you is just plain stupid.

The thing is, all of this argument over binding sites is probably irrelevant anyway, since binding sites don't actually NEED to change very often in many cases.

And apparently it was actually LOSS of DNA that distinguishes us humans from other organisms. Kinda throws a little monkey wrench ;) into your information theory silliness.
 

Stripe

Teenage Adaptive Ninja Turtle
LIFETIME MEMBER
Hall of Fame
the predictable things that you mentioned earlier, the fact that the letter 'e' is more common than the letter 'z', is an example of why English text has less information content than a random string of the same length.
Oops. You're right. Thanks for the correction. :up:

It's probably too much to ask Bob and Fred to man-up and retract what they said, but I would expect any honest person to at least stop making false claims after it's been shown to them to be false.
Actually, they are still correct. The example they used was a beach full of sand. Mixing that up doesn't make the beach more random.
 

Nick M

Black Rifles Matter
LIFETIME MEMBER
Hall of Fame
And according to an article at the National Institutes of Health, it would take 100 million years by a Darwinian process to change a single binding site in the human genome. Oops. Supposedly ALL OF HUMAN EVOLUTION from small Australopithecus chimp-like creatures to Homo sapiens has happened in only five million years.

Being generous, mitochondrial Eve's age of only 200,000 years really makes the human-monkey 5,000,000-year timeline impossible. Let alone the real number of just over 6,000 years.
 

Jukia

New member
Listened to the show instead of only reading the abstract yet, Alate?

Obviously, Alate must have some problems understanding the OP based on the show. I suggest that perhaps Pastor Bob or one of his buddies make the OP clearer so Alate can really understand that it makes sense.

I'll check back to see if that gets done.
 

Jukia

New member
Listened to the show instead of only reading the abstract yet, Alate?

The real question is whether Bob, his sidekick, or you have read the original paper, not just the abstract. Of course, the next issue is whether or not Bob, his sidekick, or you understand the paper.
If you read it and still have difficulty understanding, there is always the option of contacting the people who did the work to have it explained; in my experience, most researchers are happy to discuss their work.

But we all know what is more likely: continued misrepresentation, which makes Bob and his minions feel superior but neither advances understanding nor wins converts for Jesus.
 

Jukia

New member
Being generous, mitochondrial Eve's age of only 200,000 years really makes human-monkey 5,000,000 impossible. Let alone the real number of just over 6000 years.

And we know this because of all the research creation scientists do.
 