Discussion thread for Bob and Johnny's One on One

Stripe

Teenage Adaptive Ninja Turtle
LIFETIME MEMBER
Hall of Fame
How do you manage it, Stripe? You so often seem on the verge of saying something concrete and intelligent, but somehow you just never get there. How do you do it?
What are you talking about? :idunno:

It seems that you admit that there are no current laws in thermodynamics or information theory that make evolution impossible - but there could and should be. Is that it? Or is it something else?
Entropy describes the tendency of all things to break down and discontinue. Everything we see that is not broken down and useless can be explained by the action of intelligence or by clearly understood or observed physical processes.

Evolution does not have such an answer.

Styer has attempted to refute the challenge from the second law of thermodynamics. According to you, he has ignored, not understood, or for some other reason omitted the fact that entropy applies to more than just thermodynamics.

Instead of pretending that I'm not capable of saying something intelligent, how about you back up this correct analysis of Styer's paper (the same one I made when LoL posted it) or give an answer to the challenge?
 

chair

Well-known member
What are you talking about? :idunno:


Entropy describes the tendency of all things to break down and discontinue. Everything we see that is not broken down and useless can be explained by the action of intelligence or by clearly understood or observed physical processes.

Evolution does not have such an answer.

Styer has attempted to refute the challenge from the second law of thermodynamics. According to you, he has ignored, not understood, or for some other reason omitted the fact that entropy applies to more than just thermodynamics.

Instead of pretending that I'm not capable of saying something intelligent, how about you back up this correct analysis of Styer's paper (the same one I made when LoL posted it) or give an answer to the challenge?

The paper is about the second law of THERMODYNAMICS and how it relates to evolution. It doesn't claim to do anything else.

If you can state your challenge clearly and logically, I could try to respond to it. So far you have said only that the challenge is not the 2nd law of thermodynamics and not any current laws of information theory.

Your "challenge" then, comes down to the idea that information in general can't increase without the intervention of an intelligent being. Is that correct?
 

kmoney

New member
Hall of Fame
Johnny said:
I'd love to get some reader input on this in the discussion thread:
Is Styer unclear as to what the misconceptions are?
Do you think this was bad writing on Styer's behalf?
Should this argument be extended any longer?
I think that explicitly stating the misconceptions, instead of listing corrected statements as you say he did, would have been clearer. However, I don't think the way he did it should present any real problems. Any reader should be able to understand. And no, I don't think this point is important enough to keep arguing. :nono:
 

ThePhy

New member
The SLoT is a problem for evolution. All heat must be controlled to keep it from breaking things down as fast as it could build things up.
Rather than hand-waving and claiming that the SLoT says this or that, how about doing as Styer did and plug in the numbers? The SLoT has a very precise mathematical formulation. If you feel the math is above you, then Fred Williams of BEL Real Science Friday fame is intimately involved with a group that claims hundreds of creationists with advanced degrees. Surely a few of those can handle the math.
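For readers who want that formulation in front of them, the usual textbook statement is (given here for reference only; it is not a quote from Styer's paper): for an isolated system the total entropy change is non-negative,

\[
\Delta S_{\text{total}} = \sum_i \Delta S_i \;\ge\; 0 ,
\]

where any individual term ΔS_i may be negative, so long as the other terms increase by at least as much.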
 

Yorzhik

Well-known member
LIFETIME MEMBER
Hall of Fame
So you do NOT agree, then. With me, or with Bob Enyart, for that matter.
No. I agree with Bob Enyart. Please understand, my argument is a problem for evolution. Bob's argument is ANOTHER problem for evolution.

You asked specifically about the thermo aspect of entropy, and so I answered your question directly about where the problem is. But I didn't elaborate because this thread is about Bob's thread and his argument.
 

Yorzhik

Well-known member
LIFETIME MEMBER
Hall of Fame
Rather than hand-waving and claiming that the SLoT says this or that, how about doing as Styer did and plug in the numbers? The SLoT has a very precise mathematical formulation. If you feel the math is above you, then Fred Williams of BEL Real Science Friday fame is intimately involved with a group that claims hundreds of creationists with advanced degrees. Surely a few of those can handle the math.
No hand waving required. It's your theory, you have to do the math. Then I can come along and see if your numbers add up.

We can start another thread about it.
 

Flipper

New member
Yeah, but the typical challenge by creationists is not one of information entropy, but rather that evolution is in violation of the SLOT. Yorzhik has made it himself.

I am pleased to see that Bob agrees that this is based on a misunderstanding of the SLOT. Interestingly, though, Answers In Genesis does not include the argument from the SLOT on its "Arguments Creationists Should Not Use" web page, although I suppose that could be because it hasn't been updated recently.

Interestingly, there is a thermodynamics-related question in there, but it doesn't cover using it as an argument against evolution. Instead, it is "The 2nd Law of Thermodynamics began at the Fall."

Maybe Bob should get his webmaster friend at AIG to get that page updated. After all, lots of creationists seem to be mistakenly using thermodynamics as an argument.

And Yorzhik, remember that time you totally failed to explain how evolution is in violation of the SLOT when hurricanes somehow aren't? Get cracking!
 

ThePhy

New member
No hand waving required. It's your theory, you have to do the math. Then I can come along and see if your numbers add up.

We can start another thread about it.
Styer's paper has the numbers. Get cracking and tell us if the numbers add up. No new thread needed.
 

ThePhy

New member
Uncontested Touchdown

In the one-on-one it appears there is not much left to discuss as to whether the SLoT precludes evolution. Styer is standing in the end zone with the football, the coach is signaling a touchdown, but the crowd is arguing over why Styer went left instead of up the middle, and why the popcorn is stale, and whether the cloudy weather might be to blame. Even if it could be shown that 98% of the secular scientists conflated evolution and information theory, and 102% of the creationists did the same, Styer's mathematical analysis of thermodynamic entropy and evolution would still be untouched.
 

Bob Enyart

Deceased
Staff member
Administrator
Phy, just wondering what your thoughts are on this "nearby" thing...

Phy,

Could you reply to this from the 1 on 1:

Second, did Styer overstate his case when he said: "the entropy of any part of the universe can decrease with time, so long as that decrease is compensated by an even larger increase in some other part of the universe."

For example, can an entropy increase in ANY Location 1 really compensate for an entropy decrease in Location 2, as in:
Location 1: outer space to one parsec around Alpha Centauri
Location 2: equipment operating on the Phoenix Mars Lander (NASA has finally lost its signal, by the way).

The "parts" of the universe that have the offsetting entropies must be adjacent. No? For example, a discrete amount of decreased entropy in Denver Colorado, say from an air conditioner cooling Denver Bible Church, cannot be accounted for by a slightly greater increase in entropy on Planet FFTE, a planet orbiting a star in a galaxy furthest from the earth. I realize the entire physical universe is "connected" (CMB light, etc.). But isn't it true that the offsetting entropy must occur contiguous to the decreasing entropy, in that the distances separating these must be close enough to physically allow for the transfer of entropy? Thus I'm asking if it is slightly misleading (and I'm not making a federal case out of this Johnny, just asking) to a college student reading AJP to say, "the entropy of any part of the universe can decrease [if] compensated by an even larger increase in some other part of the universe."​

Phy, Styer's sixth reference (from his second inferred misconception :) ) is to the pages by John Patterson which include this quote:

"According to the second law, the entropy decrease (ΔS2 < 0) may occur spontaneously as long as it is coupled to increases... that overcompensate the entropy inventory nearby."

Just wondering what your thoughts are on this.

Thanks,

-Bob Enyart
KGOV.com
 

Stripe

Teenage Adaptive Ninja Turtle
LIFETIME MEMBER
Hall of Fame
The paper is about the second law of THERMODYNAMICS and how it relates to evolution. It doesn't claim to do anything else.
We know. The problem (and what the discussion is centred on) is whether it is hiding an information component.

If you can state your challenge clearly and logically, I could try to respond to it. So far you have said only that the challenge is not the 2nd law of thermodynamics and not any current laws of information theory.
E-N-T-R-O-P-Y.

You do know what entropy is, right? You do know there is more than one kind, right?

Your "challenge" then, comes down to the idea that information in general can't increase without the intervention of an intelligent being. Is that correct?
No.

Known (uninformed) physical processes can also lower local entropy.

But feel free to keep asking the same questions, chair. Perhaps I'll get frustrated and answer one incorrectly so you can continue to respond to anything but the point...
 

ThePhy

New member
Entropied Apples

Phy,

Could you reply to this from the 1 on 1:

Second, did Styer overstate his case when he said: "the entropy of any part of the universe can decrease with time, so long as that decrease is compensated by an even larger increase in some other part of the universe."

For example, can an entropy increase in ANY Location 1 really compensate for an entropy decrease in Location 2, as in:
Location 1: outer space to one parsec around Alpha Centauri
Location 2: equipment operating on the Phoenix Mars Lander (NASA has finally lost its signal, by the way).

The "parts" of the universe that have the offsetting entropies must be adjacent. No? For example, a discrete amount of decreased entropy in Denver Colorado, say from an air conditioner cooling Denver Bible Church, cannot be accounted for by a slightly greater increase in entropy on Planet FFTE, a planet orbiting a star in a galaxy furthest from the earth. I realize the entire physical universe is "connected" (CMB light, etc.). But isn't it true that the offsetting entropy must occur contiguous to the decreasing entropy, in that the distances separating these must be close enough to physically allow for the transfer of entropy? Thus I'm asking if it is slightly misleading (and I'm not making a federal case out of this Johnny, just asking) to a college student reading AJP to say, "the entropy of any part of the universe can decrease [if] compensated by an even larger increase in some other part of the universe."​

Phy, Styer's sixth reference (from his second inferred misconception :) ) is to the pages by John Patterson which include this quote:

"According to the second law, the entropy decrease (ΔS2 < 0) may occur spontaneously as long as it is coupled to increases... that overcompensate the entropy inventory nearby."

Just wondering what your thoughts are on this.

Thanks,

-Bob Enyart
KGOV.com

The Second Law mathematics used by Styer does not depend on the heat exchange being between adjacent objects. There is good reason for this.

One goal of expressing scientific laws is to “minimize the inputs”. In other words, do not include any preconditions that must be met unless the correctness of the law depends on them.

Example – Newton's Law of Gravitation. The apocryphal apple bounced off Newton's head, he looked up and saw the moon, and suddenly it came to him that both the apple and the moon were being attracted by the earth's gravity. Had he written and published his Law of Gravity in the next few minutes, it might have said that the force of gravity was [ F = g * Me * Mo / (Re^2) ] where F is force, g is the gravitational constant, Me is the mass of the earth, Mo is the mass of the object (moon or apple or …), and Re is the distance from the center of the earth. Correct, but it is only a subset of the Law of Gravity he actually put forth. Why?

Being the insightful (fringe) Christian scientist that he was, Newton realized that the force of gravity was acting not only between the earth and moons and satellites and falling apples, but between any two objects that have mass. Bob's computer (Al the 6th) in Denver is pulling on this computer (Alice the 7th) some 1000 miles away, with the force between them exactly described by the real Law of Gravity: [ F = g * Ma * Mb / (R^2) ] where Ma is now the mass of the first object, Mb is the mass of the second, and R is the distance between them. (I keep a small block of lead in my office near my wall opposite Denver, just to counter the pull of Al on Alice.)

So in the context of minimizing the restriction on the inputs, Newton made his law much more useful by deleting any need for one object to be the earth, or close by.
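To put a number on the Al-and-Alice example above, here is a quick sketch in Python; the ~10 kg masses and the round 1000-mile separation are my own illustrative guesses, not measured values:

```python
# Gravitational pull between two ordinary computers ~1000 miles apart,
# using Newton's law F = G * m1 * m2 / r^2. Masses and distance are
# illustrative assumptions, not measurements.
G = 6.674e-11            # gravitational constant, N m^2 / kg^2
m_al = 10.0              # assumed mass of "Al the 6th", kg
m_alice = 10.0           # assumed mass of "Alice the 7th", kg
r = 1000 * 1609.34       # roughly 1000 miles, in metres

F = G * m_al * m_alice / r**2
print(f"Force between the two computers: {F:.2e} N")
# Roughly 2.6e-21 N: real and nonzero, but utterly negligible.
```

The point is only that the law applies to any two masses at any distance; nothing in it singles out the earth or "nearby" objects.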

A similar rule holds for the SLoT. Use the minimum number of restrictions possible in developing the law. Nothing in the mathematical formulation of entropy stipulates locality. It deals only with energy budgets. (I use the word "budget" because of Johnny's insightful response about money in the 1-on-1).

If we move away from the rigid formalism of the mathematical logic, isn't it true that energy exchanges are always somewhat local? Yes, as far as we can tell right now. I think Styer was perhaps a bit extreme in his examples, but maybe he was trying to make a point. Even the energy budget of the earth alone is massively greater than what is required to reach his answer.
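To give a sense of the scale of that budget, here is a back-of-the-envelope sketch using round textbook figures rather than Styer's own numbers: sunlight arrives carrying entropy at the sun's effective temperature (~5800 K) and leaves the earth as infrared at roughly 255 K.

```python
import math

# Rough entropy budget of the Earth: power in from sunlight, re-radiated
# at a much lower temperature. All figures are round textbook values
# used for illustration only.
solar_constant = 1361.0                        # W/m^2 at Earth's orbit
R_earth = 6.371e6                              # Earth's radius, m
P_in = solar_constant * math.pi * R_earth**2   # intercepted solar power, W

T_sun = 5800.0                                 # K, effective temperature of sunlight
T_earth = 255.0                                # K, effective radiating temperature

dS_dt = P_in * (1.0 / T_earth - 1.0 / T_sun)   # net entropy exported, W/K
print(f"Intercepted solar power: {P_in:.2e} W")
print(f"Net entropy exported:    {dS_dt:.2e} W/K")
# On the order of 1.7e17 W and 6e14 W/K -- an enormous budget compared
# with any local entropy decrease associated with living things.
```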

But to impose locality as a necessary limitation on the SLoT is not only to introduce extraneous, non-value-added complications; it can also subtly involve the SLoT in decisions it has no part in. The structure of space-time is an active field of research, and saying the SLoT can only be applied locally would amount to defining part of the structure of space-time. Let's let General Relativity and Quantum Mechanics (and String Theory?) do their jobs of finding restrictions on the need for locality, free of unwarranted restrictions imposed by the SLoT.
 

chair

Well-known member
We know. The problem (and what the discussion is centred on) is whether it is hiding an information component.

E-N-T-R-O-P-Y.

You do know what entropy is, right? You do know there is more than one kind, right?

No. Known (uninformed) physical processes can also lower local entropy.

But feel free to keep asking the same questions, chair. Perhaps I'll get frustrated and answer one incorrectly so you can continue to respond to anything but the point...

Stripe,

If you can clearly state what "the challenge" is, I will meet it. As it stands, it is vague, and I can only guess at what you mean.

You spell entropy very nicely, but it is not at all clear what you mean when you use the term. The term has a defined meaning in thermodynamics, and a defined meaning in information theory. What do you mean when you use the term?
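For reference, the two definitions being distinguished here look like this side by side; the microstate count and the sample message below are made-up values purely for illustration:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K

# Thermodynamic (Boltzmann) entropy: S = k_B * ln(W), W = microstate count
W = 1e20                        # hypothetical number of microstates
S_thermo = k_B * math.log(W)    # result in joules per kelvin

# Information (Shannon) entropy: H = -sum p*log2(p), in bits per symbol
message = "ENTROPY"
probs = [message.count(c) / len(message) for c in set(message)]
H_info = -sum(p * math.log2(p) for p in probs)

print(f"Thermodynamic entropy: {S_thermo:.2e} J/K")
print(f"Shannon entropy:       {H_info:.2f} bits/symbol")
# Same name and the same logarithmic shape, but different quantities with
# different units -- which is why the question "which one do you mean?" matters.
```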
 

chair

Well-known member
Bob, John

Bob, John,

You have spent a lot of time discussing whether Styer was clear enough in explaining his topic. A literary discussion, perhaps interesting to some, but I suspect that giving his essay a grade on clarity isn't what interests most people here.

Aren't there more substantive issues here?

Thanks

Chair
 

Stripe

Teenage Adaptive Ninja Turtle
LIFETIME MEMBER
Hall of Fame
Stripe,

If you can clearly state what "the challenge" is, I will meet it. As it stands, it is vague, and I can only guess at what you mean.

You spell entropy very nicely, but it is not at all clear what you mean when you use the term. The term has a defined meaning in thermodynamics, and a defined meaning in information theory. What do you mean when you use the term?
Entropy

From Wikipedia, the free encyclopedia

In many branches of science, entropy is a measure of the disorder of a system. The concept of entropy is particularly notable as it is applied across physics, information theory and mathematics.
The word "entropy" is derived from the Greek εντροπία "a turning towards" (εν- "in" + τροπή "a turning").

Bob, John,

You have spent a lot of time discussing whether Styer was clear enough in explaining his topic. A literary discussion, perhaps interesting to some, but I suspect that giving his essay a grade on clarity isn't what interests most people here.

Aren't there more substantive issues here?

Thanks

Chair
The substantive issue is that the majority on both sides have misunderstood what the challenge to evolution from entropy is. Quit confusing matters.
 

ThePhy

New member
Chair asked:
Stripe,
If you can clearly state what "the challenge" is, I will meet it. As it stands, it is vague, and I can only guess at what you mean.

You spell entropy very nicely, but it is not at all clear what you mean when you use the term. The term has a defined meaning in thermodynamics, and a defined meaning in information theory. What do you mean when you use the term?
Stripe responded:
Entropy

From Wikipedia, the free encyclopedia

In many branches of science, entropy is a measure of the disorder of a system. The concept of entropy is particularly notable as it is applied across physics, information theory and mathematics.
The word "entropy" is derived from the Greek εντροπία "a turning towards" (εν- "in" + τροπή "a turning").

The substantive issue is that the majority on both sides have misunderstood what the challenge to evolution from entropy is. Quit confusing matters.
Chair, it is abundantly clear that your challenge is to root out every misinformed secular scientist, every secular scientist who is not hyper-explicitly clear on what type of entropy is being discussed, and every creationist without exception (except for maybe one), and re-educate the lot of them.

Or, alternatively, you can note the common use of the term across the disparate fields listed in the wiki article, and undertake to show that it has a causal connection across all of them. Show that a change in information entropy forces a change in thermo. Disregard that the wiki article makes no mention of a functional dependency between the various applications of the term. This rewriting of the laws of science is for Stripe, who seems to be averse to admitting that entropy deals with separate concepts as it is applied in different fields. Good luck.
 

Stripe

Teenage Adaptive Ninja Turtle
LIFETIME MEMBER
Hall of Fame
Entropy deals with separate concepts as it is applied in different fields.

The challenge to evolution is that there is no known means by which sunlight, or any energy, can be turned into information without intelligent guidance.
 

ThePhy

New member
Entropy deals with separate concepts as it is applied in different fields.

The challenge to evolution is that there is no known means by which sunlight, or any energy, can be turned into information without intelligent guidance.
Outside what the Styer paper addresses.
 