Discussion thread for Bob and Johnny's One on One

Status
Not open for further replies.

chair

Well-known member
"Entropy" refers to separate concepts depending on the field in which it is applied.

The challenge to evolution is that there is no known means by which sunlight, or any energy, can be turned into information without intelligent guidance.

Stripe- Do you accept that there is a thing sometimes called "micro-evolution"?
 

ThePhy

New member
The challenge to evolution is that there is no known means by which sunlight, or any energy, can be turned into information without intelligent guidance.
If God chose to, could He make a simple modification to DNA that adds information to it?
 

Stripe

Teenage Adaptive Ninja Turtle
LIFETIME MEMBER
Hall of Fame
Outside what the Styer paper addresses.
Then you've conceded all of Johnny's points for him. You also agree with my instant response to LoL's original thread. Styer has not addressed the full and correct challenge to evolution from entropy.

Replication
I see. And would you mind sharing how it is that biological evolution ignores the trends imposed on everything else by entropy?

Stripe- Do you accept that there is a thing sometimes called "micro-evolution"?
I'll not use that term. Far too confusing. Populations and features follow trends that change over time. Those changes adhere to the principles of entropy in that a new feature always comes at a net cost to the population.

If God chose to, could He make a simple modification to DNA that adds information to it?
Yes.
 

Flipper

New member
Hey Bob Enyart,

Next time you're doing a Real Science Friday, can you press Fred Williams to update the "Arguments Creationists Should Not Use" section on AIG to include an entry on the SLoT (second law of thermodynamics)?

I bring this up because until this recent clarification on TOL, almost all challenges that I have seen presented by creationists regarding entropy have been formulated in regards to thermodynamic entropy.

It's nice to hear that AIG are now formally on board with what evolutionists have been saying for years: thermodynamic entropy has nothing to do with whether evolution is possible or not. So they really should get the message out to the flock, don't you think? I mean, since they care so much about science and all.
 

chair

Well-known member
I'll not use that term. Far too confusing. Populations and features follow trends that change over time. Those changes adhere to the principles of entropy in that a new feature always comes at a net cost to the population.

Ah. Can you give an example of how this works?
 

ThePhy

New member
Then you've conceded all of Johnny's points for him. You also agree with my instant response to LoL's original thread. Styer has not addressed the full and correct challenge to evolution from entropy.
Just so we are not crossing paths with semantics, when you say that “Styer has not addressed the full and correct challenge to evolution from entropy”, I am going to presume you must be including information entropy, since no one I’ve seen is even pretending to counter him on thermodynamic entropy.

But since, as has been shown several times, Styer made it explicitly clear that he was addressing Thermodynamic entropy, and only Thermodynamic entropy, then you are correct that he has not covered the full range. He never intended to.

As to my upending Johnny, remember Johnny is the one-on-one participant. I am just on the sidelines, and what I say is not what decides the outcome.
 

Stripe

Teenage Adaptive Ninja Turtle
LIFETIME MEMBER
Hall of Fame
Just so we are not crossing paths with semantics, when you say that “Styer has not addressed the full and correct challenge to evolution from entropy”, I am going to presume you must be including information entropy, since no one I’ve seen is even pretending to counter him on thermodynamic entropy.
The challenge is from entropy. Information and thermodynamics are two fields that utilise this observed trait that can be applied to all scientific fields.

But since, as has been shown several times, Styer made it explicitly clear that he was addressing Thermodynamic entropy, and only Thermodynamic entropy, then you are correct that he has not covered the full range. He never intended to.
Do you think he would be interested in addressing the challenge as it now stands?

As to my upending Johnny, remember Johnny is the one-on-one participant. I am just on the sidelines, and what I say is not what decides the outcome.
:chuckle: I do tend to lump you guys together a bit, don't I? Apologies. That was not my intent, just overly strong emphasis on the point I wanted to make...
 

Stripe

Teenage Adaptive Ninja Turtle
LIFETIME MEMBER
Hall of Fame
No, I haven't. So please describe the changes in the population and what the cost was.
I think the changes wrought in Fijian natives are obvious if one assumes they originated in the Middle East...

I guess the cost is that they would be forced through a genetic bottleneck were a sample group of them ever transplanted to an environment that pushed for lighter skin. With all the genetic code for both light skin and dark skin it would be simple to adapt to one extreme by dropping off the information for the other. But a return to the original environment could never see them regain all that lost genetic code.

Not sure if my biological terminology is correct and I'm sure it's a bit more complex than that, but the simple point is that entropy ensures that there will be a cost. I can be certain it exists whether or not I have an idea of what it might be.

I can also be certain that if we are talking about genetics then the cost exists in a genetic medium rather than in sunlight.
 

Jukia

New member
Not sure if my biological terminology is correct and I'm sure it's a bit more complex than that, but the simple point is that entropy ensures that there will be a cost. I can be certain it exists whether or not I have an idea of what it might be.

A bit more complex than Stripe understands---ya think???
 

bybee

New member
very complex

A bit more complex than Stripe understands---ya think???

I wonder, as I wander, in the maze of entropy and thermodynamics: Is someone saying that matter is disappearing, ceasing to exist? And, conversely, that matter is being created out of nothing to fill the space vacated by the matter which has been destroyed? I am somewhere in left field! What? bybee
 

Stripe

Teenage Adaptive Ninja Turtle
LIFETIME MEMBER
Hall of Fame
I wonder, as I wander, in the maze of entropy and thermodynamics: Is someone saying that matter is disappearing, ceasing to exist? And, conversely, that matter is being created out of nothing to fill the space vacated by the matter which has been destroyed? I am somewhere in left field! What? bybee
Make it support evolution and your Nobel is guaranteed. :thumb:

:chuckle:
 

ThePhy

New member
The challenge is from entropy. Information and thermodynamics are two fields that utilise this observed trait that can be applied to all scientific fields.
I don’t know what “this observed trait” is. If you are speaking of entropy, then it is not an observed trait common to both fields, any more than showing fear (to quail) is the same as eating a type of bird called a quail.
Do you think he would be interested in addressing the challenge as it now stands?
I have no idea where Styer’s future interests lie. E-mail him and ask.
I do tend to lump you guys together a bit, don't I. Apologies.
Apology accepted. Johnny reminds me of some of my colleagues, great guys who honor their faith yet are not afraid of doing honest science.
 

ThePhy

New member
Bob was KO’d - but is asking for a tie

Win or lose in this One-on-One debate should be on the debate subject. The problem is, the specific question to be decided is not declared external to the debate, but has to be inferred from the opening post.

After presenting some preliminaries, in referring to Styer’s paper Enyart asserts:
But the paper repeats an error that Henry Morris made fifty years ago,
which error Enyart goes on to say is that of conflating the two definitions of entropy. In his OP Enyart relies on claims by Timothy Stout, and then talks a bit more about entropy confusion. But nowhere in his OP, other than by saying it is so, did Enyart show that Styer mixed up the disparate definitions of entropy. That is the question of this debate: did Styer cheat by relying on two different concepts of entropy?

Enyart has shown that the confusion over entropy exists in both the evolutionist community and the Creationist community. The thing that he has not done is to show that Styer was confused, or that Styer relied on that confusion in the larger community to establish his point.

Enyart has faulted Styer for not being hyper-explicit about saying that his use of the word “entropy” referred to thermodynamics. Yet, as Johnny showed, both in Styer’s opening sentence and continuing throughout the paper, he is speaking of Thermodynamics.

Styer’s analysis discrediting thermodynamic entropy as a barrier to evolution is untouched by Enyart’s repeated efforts to show that Styer relied on a confused definition of entropy.

And note that in his One-on-One post of Dec 14 Enyart includes an offer for Johnny to concede in the title of his post, but in the body of his Dec 16 post he has lowered his sights and is now asking for a tie. Wonder why?
 

Dan Styer

New member
Notes on Entropy and Evolution

On 5 December 2008 Bob Enyart said that my paper on "Entropy and Evolution" claims that


evolution on earth can appear to violate the 2nd Law locally because a decrease in entropy as a squid evolves in the sea is offset by a fluctuation of entropy in a galaxy far, far away.​


In fact, I never said this, nor anything like it.

1. The phrase "to violate the 2nd Law [of thermodynamics] locally" has no meaning. The second law says that "heat flow is from high temperature to low temperature" -- the notion of "local" doesn't appear in the second law.

2. I have never in my life used the term "fluctuation of entropy" because I've never understood what it meant.

3. My paper shows that the decrease in entropy as a squid evolves in the sea can be offset by an increase in the entropy of the microwave background. The microwave background is not "far, far away" ... it's right here. We're immersed in it.
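The bookkeeping Styer describes can be sketched numerically: a local entropy decrease is permitted so long as a larger increase occurs elsewhere in the same system. The numbers below are illustrative placeholders, not figures from Styer's paper; only the microwave-background temperature (about 2.7 K) is a real physical value.

```python
# Toy second-law bookkeeping: a local entropy decrease is allowed as long
# as the total entropy of the whole system does not decrease.
# Placeholder numbers; NOT figures from Styer's paper.

def entropy_change_from_heat(q_joules, temp_kelvin):
    """Clausius entropy change dS = Q/T for heat Q absorbed at temperature T."""
    return q_joules / temp_kelvin

# Suppose some organizing process locally *removes* this much entropy:
delta_s_local = -1.0e-2  # J/K (placeholder)

# The same process dumps waste heat into the cosmic microwave background
# (T ~ 2.7 K), which, as Styer notes, is right here around us:
q_waste = 1.0  # J of heat radiated (placeholder)
delta_s_background = entropy_change_from_heat(q_waste, 2.7)

total = delta_s_local + delta_s_background
print(f"local:      {delta_s_local:+.3e} J/K")
print(f"background: {delta_s_background:+.3e} J/K")
print(f"total:      {total:+.3e} J/K  (second law satisfied: {total >= 0})")
```

Because the background sits at such a low temperature, even a modest amount of waste heat produces a comparatively large entropy increase, which is why the local decrease is easily offset.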

====================================

Bob Enyart goes on to say


Entropy is NOT a manifestation of the 2nd law of thermodynamics.

It is not.

The reverse is true.

The 2nd law is a manifestation of entropy.​


Notice that Bob just states this claim with no supporting evidence.

There are a number of different approaches to entropy, but the historical one is to begin with the second law -- "heat flow is from high temperature to low temperature" -- and from it derive the existence of entropy. (This derivation is long and subtle, and is perhaps the most beautiful piece of logic I've ever encountered. If you haven't seen it, I recommend Fermi's old and very clear book Thermodynamics.) As such, entropy is not a "manifestation" of the second law of thermodynamics but a consequence of it. However, the word "manifestation" is unclear here, so I'm not entirely sure what Bob means.
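In outline, the standard route Styer alludes to runs as follows. This is a textbook sketch of the Clausius-style derivation, not a quotation from Styer's paper:

```latex
% Textbook sketch: deriving entropy from the second law.
% The Clausius statement implies, for any thermodynamic cycle,
\oint \frac{\delta Q}{T} \le 0 ,
% with equality for reversible cycles. That equality means the integral
% of \delta Q_{\mathrm{rev}}/T is path-independent, so it defines a
% state function S, the entropy:
dS = \frac{\delta Q_{\mathrm{rev}}}{T} ,
% and applying the inequality to an isolated system then yields
\Delta S \ge 0 .
```

This is the sense in which entropy is a consequence of the second law rather than the other way around.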

================================

Bob Enyart speaks long and hard about the difference between "heat entropy" and "information entropy". It is quite clear from context that by entropy I mean "thermodynamic/statistical mechanical entropy". A simple glance at the equations in my paper would have made that abundantly clear.

================================

Bob Enyart goes on to say


Entropy has to do with the move from order to disorder in any organized system, whether it is organized by energy states, ergonomics (arrangement of utensils in your kitchen, etc), aesthetic values, information content, etc.​


This is very false. My paper "Insight into Entropy" (also in American Journal of Physics) is devoted to overturning this misconception. Frank Lambert has also devoted considerable energy (in the non-physics sense of the word!) to the same end. See

http://www.entropysite.com/

(By the way, Bob criticizes me severely for not distinguishing between "heat entropy" and "information entropy", but in the passage quoted above he does exactly the same thing!)
 

Dan Styer

New member
What does entropy mean?

Radio announcer Bob Enyart takes me to task for not distinguishing between "heat entropy" and "information entropy" in my American Journal of Physics article "Entropy and Evolution". Any knowledgeable person could just look at the equations in my paper and see that I mean "thermodynamic/statistical mechanical entropy".

The word "entropy", like most words, has many meanings, and the meaning in use is determined from context. If I say "Run away from danger", you don't think "A run is a small stream, so I must follow a small stream away from danger".

Here I want to present some of the other meanings of the word "entropy", to emphasize that it would have been silly to say that I'm not talking about each of them:

information entropy

topological entropy

Kolmogorov entropy

Kolmogorov-Sinai entropy

metric entropy

Gibbs entropy

Boltzmann entropy

Tsallis entropy

von Neumann entropy

Shannon entropy

Rényi entropy

volume entropy

If I had spent that much time talking about what I'm not going to be talking about, the paper would have been quite long indeed!
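To make concrete that these are genuinely different quantities sharing one name, here is a minimal sketch contrasting Shannon (information) entropy with Boltzmann (thermodynamic/statistical-mechanical) entropy. The formulas are standard; the examples are illustrative and not drawn from Styer's paper.

```python
# "Information entropy" (Shannon) and thermodynamic entropy (Boltzmann)
# are different formulas with different units, despite the shared name.
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum p * log2(p), measured in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(num_microstates):
    """Boltzmann entropy S = k_B * ln(Omega), measured in J/K."""
    return K_B * math.log(num_microstates)

# A fair coin carries 1 bit of Shannon entropy:
print(shannon_entropy([0.5, 0.5]))  # 1.0

# Two equally likely microstates carry a thermodynamic entropy of
# k_B * ln(2), roughly 9.57e-24 J/K:
print(boltzmann_entropy(2))
```

One is a dimensionless measure over a probability distribution; the other carries physical units of joules per kelvin, which is why equations alone identify which "entropy" a physics paper means.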
 

bybee

New member
names

Radio announcer Bob Enyart takes me to task for not distinguishing between "heat entropy" and "information entropy" in my American Journal of Physics article "Entropy and Evolution". Any knowledgeable person could just look at the equations in my paper and see that I mean "thermodynamic/statistical mechanical entropy".

The word "entropy", like most words, has many meanings, and the meaning in use is determined from context. If I say "Run away from danger", you don't think "A run is a small stream, so I must follow a small stream away from danger".

Here I want to present some of the other meanings of the word "entropy", to emphasize that it would have been silly to say that I'm not talking about each of them:

information entropy

topological entropy

Kolmogorov entropy

Kolmogorov-Sinai entropy

metric entropy

Gibbs entropy

Boltzmann entropy

Tsallis entropy

von Neumann entropy

Shannon entropy

Rényi entropy

volume entropy

If I had spent that much time talking about what I'm not going to be talking about, the paper would have been quite long indeed!

Sounds sort of like denominations to me... so many split hairs. Way beyond me. bybee
 