## Control volumes


### Control volumes

If you work in any of the sciences or engineering, you’re probably already familiar with the concept of breaking something down into parts and analyzing them separately. Generally this is done with software that models some complex physical behavior. Unfortunately, the concept, though widely used, is not well defined from a philosophical perspective.

The concept I have in mind is the basis of programs called “finite element analysis” in many engineering fields. When the concept is applied to fluids, the approach is called “computational fluid dynamics”. When applied to neurons we have “compartment models”, and when applied to solar systems, galaxies and universes we get “n-body simulations”. When different types of analysis are done on the same model, such as stresses on solids with fluid interactions, temperature and thermodynamic changes, electromagnetic effects, etc., the analysis is called a multiphysics analysis or something along those lines. One can use this multiphysics software, for instance, to model Rayleigh-Bénard convection, where heat transfer, fluid properties and the Navier-Stokes equations all have to be combined to model the phenomenon.

The basic idea is that we represent the system to be analyzed in three dimensions by producing a model of it in a computer. Once a computer model is created, physical properties are assigned and the model is broken down into numerous small parts or volumes. For analysis of stresses and strains in materials such as bridges or aircraft wings, the software will break up a solid object into nodes by producing a mesh that delineates specific volumes of solid material. Fluid dynamics is similar in that the fluid is broken up into volumes by producing a mesh. Compartment models of neurons also break up single neurons into small volumes (i.e., compartments), but they model each compartment using an equivalent electrical circuit made of resistances, inductances and capacitances. Galaxies are a bit different: the models might use individual solar systems, stars or other large masses as their ‘volumes’ of space. Gravity, including the effects of relativity, can be modeled. In these models, time steps are often used, and interactions can be found to propagate through the model depending on the length of the time step.
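To give a feel for the n-body case, here's a toy two-body integrator. This isn't taken from any simulation package; the leapfrog scheme is a standard textbook choice, and every name and parameter below is invented for illustration:

```python
# Toy two-body gravity sketch: each body's state changes only through
# pairwise forces evaluated once per time step (kick-drift-kick leapfrog).
# Units are chosen so G = 1; all names and values are illustrative.

def accelerations(pos, masses):
    """Pairwise gravitational acceleration on each body (2-D)."""
    n = len(pos)
    acc = [[0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = pos[j][0] - pos[i][0]
            dy = pos[j][1] - pos[i][1]
            r3 = (dx * dx + dy * dy) ** 1.5
            acc[i][0] += masses[j] * dx / r3
            acc[i][1] += masses[j] * dy / r3
    return acc

def leapfrog(pos, vel, masses, dt, steps):
    """Kick-drift-kick: interactions propagate one time step at a time."""
    acc = accelerations(pos, masses)
    for _ in range(steps):
        for i in range(len(pos)):
            vel[i][0] += 0.5 * dt * acc[i][0]
            vel[i][1] += 0.5 * dt * acc[i][1]
            pos[i][0] += dt * vel[i][0]
            pos[i][1] += dt * vel[i][1]
        acc = accelerations(pos, masses)
        for i in range(len(pos)):
            vel[i][0] += 0.5 * dt * acc[i][0]
            vel[i][1] += 0.5 * dt * acc[i][1]
    return pos, vel

# Circular two-body orbit: heavy mass at rest, light satellite at r = 1,
# with orbital speed v = sqrt(GM/r) = 1 for M = 1 and r = 1.
pos = [[0.0, 0.0], [1.0, 0.0]]
vel = [[0.0, 0.0], [0.0, 1.0]]
pos, vel = leapfrog(pos, vel, [1.0, 1e-9], dt=0.001, steps=1000)
r = (pos[1][0] ** 2 + pos[1][1] ** 2) ** 0.5  # should stay near 1
```

The point to notice is structural: nothing in the program refers to the orbit as a whole; the ellipse "emerges" from nothing but local, pairwise interactions applied step by step, which is exactly the 'break it into parts and let the parts interact' pattern described above.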

So the objective of this thread is to better define the more general concept. I’d like to discuss the basis for this kind of analysis - why do so many computer programs do it this way? Why break something down into parts and how are those parts treated in relation to the whole? Does this concept of breaking something down into parts reflect how nature works or is there something more to it? We often hear the slogan, “The whole is more than the sum of the parts.” However, the use of computer programs to break things down like this seems to challenge those claims. If there is something more and we know what that something more is, then shouldn’t these programs be reflecting, in some way, what that something more is?

Dave_C
Member

Posts: 329
Joined: 08 Jun 2014
Location: Allentown

### Re: Control volumes

I’m going to push this along a bit and see if there’s any interest. Unfortunately, this is the boring part. The scary stuff has to have a basis and a language, so this thread is primarily for definitions. If there are no takers, that’s fine. I realize much of this can seem pedantic, so my apologies for that.

This thread is intended to further develop the concepts of separability and weak emergence by creating definitions around these concepts. Bedau uses the terms “micro” and “macro” for weak emergence, but those terms aren’t well defined and, as mentioned, they don’t apply to nonseparable phenomena. Furthermore, engineering and science don’t generally use those terms, so I’d like to suggest an alternate set of terms that are commonly used in mechanical engineering, even if not in other areas. I had considered using terms such as “finite element” as used in computer modeling, but my intent isn’t to consider how those models work; rather, my intent is to consider why.

The concept I’d like to introduce is based on breaking apart a physical volume of space into smaller volumes which are still large enough that quantum physics does not play a role in their interactions. For lack of a better basis, I’ll use the concept of control volumes. The initial use of control volumes for fluid analysis is largely credited to Ludwig Prandtl at the beginning of the 20th century, according to Vincenti. When first introduced, the control volume was "an imagined spatial volume having certain characteristics and introduced for purposes of analysis" of fluid mechanics. The concept was later expanded to aid in thermodynamic analysis, heat and mass transfer, and other types of analysis. Below are a few definitions:

1. Control Volume: Control volumes are typically defined as a region of three-dimensional space selected for the purposes of analysis and on which classical-level physical laws can be applied. Strictly speaking, the control volume concept is only applied to fluid mechanics or thermodynamics and only to analytical situations, although similar concepts are applied throughout the sciences. There are both “differential formulations” and “finite control volume formulations”, the difference being that differential equations are applied to the differential formulation such that the fluid obeys physical laws at all points within the volume. In comparison, the finite control volume formulation considers all mass within a control volume to have the same properties, including velocity and density for example. The finite control volume concept lends itself well to numerical analysis, but the differential control volume might lend itself better to philosophical discussions, since we might imagine then having a system with no discrete points that could identify the boundaries of the control volume. The concept of a differential control volume is essentially the same as continuum mechanics, in which physical properties are spread out uniformly over some volume of space.

2. Control Surface: Control volumes are surrounded by a control surface. That boundary is an imaginary, two-dimensional surface, like the thin skin of a balloon. Its purpose is to identify the surface around a control volume which can be used to locate and identify any causal relationship that might have an influence on the volume. The control volume therefore is only influenced by what crosses its control surface and by local fields. Similarly, a control volume can only affect its surroundings by operating across the control surface or by producing fields which propagate locally and at a rate defined by the field’s rate of propagation.

3. Causal Action: The term "causal action" is not used in the control volume concept and is being added to refine the concept further. We need a term for those macroscopic interactions that are made up of the more fundamental physical interactions between atoms and molecules. For interactions where a large enough aggregate of molecular interactions can be averaged out, we end up with classical physics. Although the term “causal action” might be self-explanatory, it really needs some sort of definition.

Causal actions are those interactions which are, for example, modeled using various finite element methods: normal and shear stresses in solids and fluids, momentum and energy transfer, heat transfer, and fields such as gravitational and EM fields.

All causal actions considered for the control volume concept must act at a classical level. There must be a large number of particles creating the causal action such that interactions can be averaged over this aggregate. Interactions at the molecular level are not causal actions as the term will be used here. Here are a few considerations when defining causal actions:

- Causal actions act on or across a control surface. As such, they are local.
- Causal actions may or may not have an influence on what is inside a control volume. EM fields for example, might not interact with wooden structures, whereas fluid momentum will, but both can be considered causal actions.
- Causal actions are created by matter with knowable states or properties.
- Causal actions are applied to the control surface at the time of the local interaction.
- Causal actions propagate through space at a velocity which is dependent on the means of propagation.

4. Control Mechanism: The term "control mechanism" is another new term being introduced to the control volume concept. A control mechanism is nothing more than a set of control volumes. The volumes should all be adjacent to one another, but the CM may take on any arbitrary shape.
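To make these definitions concrete, here's a toy sketch (all names and numbers invented for illustration) of a one-dimensional chain of control volumes. Each volume exchanges a conserved quantity only across the control surfaces it shares with its neighbors:

```python
# A 1-D chain of control volumes exchanging a conserved quantity only
# across shared control surfaces.  Every interior flux leaves one volume
# and enters its neighbor, so the total over any "control mechanism"
# (a set of adjacent volumes) changes only via its outer surfaces.
# All names and values here are invented for the sketch.

def step(u, dt, k=1.0):
    """One explicit update: the causal action across surface i is
    proportional to the difference between the two adjacent volumes."""
    fluxes = [k * (u[i] - u[i + 1]) for i in range(len(u) - 1)]
    new = u[:]
    for i, f in enumerate(fluxes):
        new[i] -= dt * f       # leaves volume i ...
        new[i + 1] += dt * f   # ... and enters volume i+1: local and conservative
    return new

u = [4.0, 0.0, 0.0, 0.0]       # all of the quantity starts in volume 0
total0 = sum(u)
for _ in range(200):
    u = step(u, dt=0.1)
total = sum(u)                 # conserved exactly (up to rounding)
```

The design point is that conservation isn't imposed anywhere; it falls out of the fact that every causal action is booked on both sides of a control surface, which is the locality claim above in miniature.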

That’s about it. I think what’s important is why we should expect to simulate classical physics using finite or differential control volumes. We do this because that seems to be how the world works at the level of tea cups and car tires. Interactions occur locally, not nonlocally. Where large numbers of molecules can produce some measurable property, we can utilize that property and we can characterize the interaction at the local level. We don’t need and we wouldn’t expect to find nonlocal interactions where phenomena can be described by classical physics. If however, a phenomenon cannot be described by classical physics, then we won’t be able to attribute that phenomenon to local interactions between parts or between control volumes.

Vincenti, W. G. (1990). *What Engineers Know and How They Know It: Analytical Studies from Aeronautical History*. Johns Hopkins University Press.

Dave_C
Member

Posts: 329
Joined: 08 Jun 2014
Location: Allentown

### Re: Control volumes

Just to be pedantic ...

- Causal actions propagate through space at a velocity which is dependent on the means of propagation.

A field does not have a velocity.

You mention neurons briefly yet want to stay with classic mechanical physics. If it is pertinent, where does the phenomenon of phenomenon fit in? Where, and how (if at all), does consciousness fit into one of these models?

Have you studied much about Gestalt Theory?

BadgerJelly
Resident Member

Posts: 5606
Joined: 14 Mar 2012

### Re: Control volumes

You may find DaveC's emergence thread, next door, of some help regarding how consciousness relates to this.

TheVat

Posts: 7221
Joined: 21 Jan 2014
Location: Black Hills

### Re: Applied Science vs. Understanding

Dave_C wrote:The concept I have in mind is... called "finite element analysis" in many engineering fields.

Engineers apply science. There are jobs that need to be done and engineers need only know that which is necessary to complete them. They most often do a spectacular job of it.

However, despite their utility, tools such as finite element analysis are an admission of ignorance. A finite element can be made arbitrarily small yet retain "state variables" which are clearly emergent phenomena. This is a conceptual problem in that a finite element can be too small to accommodate a statistically adequate population of particles for "smooth" state variables such as pressure, temperature, etc.

Bottom line: Particles don't use finite element analysis. If we don't understand a phenomenon at the particle level, our understanding is incomplete. Terms such as volume, pressure, and temperature as well as phases (solid, liquid, gas) don't even apply at the particle level.

That's not to say that I would advise an engineer to abandon finite element analysis (he or she's got work to do), but I would advise it to a physicist.

Dave_C wrote:Causal actions act on or across a control surface. As such, they are local.

Even a concept as apparently simple as "locality" is not yet understood.* Thus, the invention of "fields", which are also miserably ill conceived. Nonetheless, you've done a good job identifying the problem.

*Just how does one particle interact with another?

Faradave
Active Member

Posts: 1835
Joined: 10 Oct 2012
Location: Times Square (T2)

### Re: Control volumes

Step One: don't ever use the word "particle." It seems convenient, but it just makes endless trouble. :-)

TheVat

Posts: 7221
Joined: 21 Jan 2014
Location: Black Hills

### Re: Control volumes

BadgerJelly » June 18th, 2015, 2:19 am wrote:A field does not have a velocity.

You mention neurons briefly yet want to stay with classic mechanical physics. If it is pertinent, where does the phenomenon of phenomenon fit in? Where, and how (if at all), does consciousness fit into one of these models?

Have you studied much about Gestalt Theory?

Hi Badger. I guess we wouldn't say the field has velocity, but any perturbation in the field has a velocity. Wikipedia defines a field: "In physics, a field is a physical quantity that has a value for each point in space and time. For example, on a weather map, the surface wind velocity is described by assigning a vector to each point on a map. Each vector represents the speed and direction of the movement of air at that point."
https://en.wikipedia.org/wiki/Field_(physics)

Regarding consciousness, what I've seen is that people typically talk past each other. It's like the tower of Babel. I'm trying to avoid that by creating the language before tackling the difficult bits. And no, I'm not familiar with Gestalt Theory.

Dave_C
Member

Posts: 329
Joined: 08 Jun 2014
Location: Allentown

### Re: Applied Science vs. Understanding

Hi Dave,
Faradave » June 18th, 2015, 10:53 am wrote:However, despite their utility, tools such as finite element analysis are an admission of ignorance. A finite element can be made arbitrarily small yet retain "state variables" which are clearly emergent phenomena. This is a conceptual problem in that a finite element can be too small to accommodate a statistically adequate population of particles for "smooth" state variables such as pressure, temperature, etc.

I don't disagree. At some point, classical physics breaks down.

Bottom line: Particles don't use finite element analysis. If we don't understand a phenomenon at the particle level, our understanding is incomplete. Terms such as volume, pressure, and temperature as well as phases (solid, liquid, gas) don't even apply at the particle level.

I don't disagree. Knowing how the interactions between particles proceed is a much deeper understanding of nature than understanding it at the level of classical physics. However, the point of bringing up classical physics is that there's a widely accepted understanding that things like computers, neurons, and control volumes of water or solid or whatever interact not because of some specific quantum interactions between the parts (e.g., between neurons or between computer chips), but because the phenomenon in question is classical in nature. Suggesting, for example, that we must understand how neurons interact at the quantum mechanical level implies that the quantum mechanical level is important to how those neurons interact. Christof Koch wrote an interesting paper that addresses this:

Although brains obey quantum mechanics, they do not seem to exploit any of its special features. ...

Two key biophysical operations underlie information processing in the brain: chemical transmission across the synaptic cleft, and the generation of action potentials. These both involve thousands of ions and neurotransmitter molecules, coupled by diffusion or by the membrane potential that extends across tens of micrometers. Both processes will destroy any coherent quantum states. Thus, spiking neurons can only receive and send classical, rather than quantum information. ...

The description by Koch I think is widely understood by neuroscientists, and I think we should fully accept it. His point is that neurons spike because of large numbers of ions and other molecules acting on them. What isn’t explicitly stated, but we can certainly take as implied, is that any similar large number of ions and other molecules would do the same. There is nothing peculiar about those molecules acting on a particular neuron. Any equivalent group that acted in the same way will suffice. We can say this because it is only the measurable properties of those molecules which, when acting together, produce a sufficient influence on the neuron to cause it to spike. I won’t claim to be an expert on neuron interactions, but I think it’s relatively clear why the interactions between neurons are considered “classical” in nature.
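To illustrate the classical picture Koch describes, here's a minimal "leaky integrate and fire" sketch. This is a standard toy point-neuron model, not anything from Koch's paper, and every parameter value below is invented for illustration. The membrane potential is a single aggregate state variable, and the neuron spikes whenever that classical variable crosses a threshold:

```python
# Leaky integrate-and-fire neuron: the membrane potential is a classical
# state variable driven by aggregate ionic current, with no reference to
# the quantum details of individual molecules.  Parameters are invented.

def lif_spike_times(i_input, dt=0.1, tau=10.0, v_rest=-65.0,
                    v_thresh=-50.0, v_reset=-65.0, r_m=10.0):
    """Euler-integrate dV/dt = (v_rest - V + R*I)/tau, in ms and mV;
    emit a spike and reset whenever V reaches threshold.
    i_input is a list of input currents, one per time step."""
    v = v_rest
    spikes = []
    for step, i_now in enumerate(i_input):
        v += dt * ((v_rest - v + r_m * i_now) / tau)
        if v >= v_thresh:
            spikes.append(step * dt)
            v = v_reset
    return spikes

# A constant drive of 2.0 (arbitrary units) is suprathreshold here:
# the steady state would be v_rest + R*I = -45 mV, above the -50 mV
# threshold, so the model spikes regularly.
spikes = lif_spike_times([2.0] * 1000)   # 100 ms of simulated input
```

Note that only the aggregate drive matters: any input sequence with the same summed current produces the same spikes, which is the "any equivalent group will suffice" point in code form.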

If neuron interactions are classical in nature, then we should be able to find that neurons are separable, which I believe is the case. Neurons are experimented with in vitro in an attempt to understand how they react in vivo. That isn’t to suggest that we can at this point in history, duplicate everything that occurs to a neuron and make it literally undergo the same physical changes in state in vitro that occur in vivo. But IFF we were capable of doing so THEN we should expect the neuron to undergo (essentially) identical changes in state. I say essentially only because we shouldn’t expect to be able to duplicate all atoms and molecules exactly, so some variation can be expected. Classical physics only deals with the overall, gross properties. Just as important, we should not expect there are other, nonlocal causal influences on the neuron. This follows from the separability of classical physics.

Koch, C., & Hepp, K. (2006). Quantum mechanics in the brain. *Nature*, 440, 611-612.

Best regards.

Dave_C
Member

Posts: 329
Joined: 08 Jun 2014
Location: Allentown

### Re: Control volumes

I'm willing to concede that at the present time, it does not appear that the brain is making use of subtle quantum phenomena in other than aggregate (i.e. classical) levels. There doesn't seem to be a biological role for qubits or other aspects of quantum computing. So, if "control volumes" help, go for it.

Dave_C wrote:Why break something down into parts and how are those parts treated in relation to the whole? Does this concept of breaking something down into parts reflect how nature works or is there something more to it?

I think it makes intuitive sense because it appears that it is how things are made. For all the attention given to the act of conception, it may escape our attention that the real work of reproduction is eating. Food is consumed and broken down into components, which are then reassembled into a human with, among other things, a brain. This bottom up manufacturing process suggests a similar path to understanding.

If any supernatural process is involved, it would seem hopeless to extract an understanding of it from within natural limitations. Science tries its best on the assumption that this is not the case.

Faradave
Active Member

Posts: 1835
Joined: 10 Oct 2012
Location: Times Square (T2)

### Re: Control volumes

I'm not sure I can fully follow this discussion. However, I'll throw in my 2 cents.
My impression is that finite element analysis is particularly useful because it helps the student to define:
1) what properties of a system can be derived from the properties of the constituent elements
2) what properties of a system can be derived from the interactions of the constituent elements
3) what properties are not reducible (i.e. cannot be derived in the above manner)
4) what properties of a system can be reproduced by a finite element computational model though they could not be derived from the above: this is a crucial aspect because the model can help identify the mechanism of emergence of such properties
5) what properties of the system cannot be reproduced by computational models - as detailed and precise as you wish - and therefore call for some theoretical / physical aspect that has not been taken into consideration but must be involved in generating unexpected interactions or emergent phenomena at the higher level of the system
6) whether there is a way of forcing the finite element model into the actually observed behavior, so that possible mechanisms for the observed emergent phenomena can be hypothesized and studied.

Finite element analysis with modern computers is much more powerful than any analytic procedure in pinpointing irregularities and unexpected behaviors, especially when a fully analytical description of a system is beyond our capabilities.

neuro
Forum Moderator

Posts: 2624
Joined: 25 Jun 2010
Location: italy

### Re: Control volumes

Hi neuro. Thanks for the comments. Feel free to point out what you feel is ambiguous – I'm very interested in what doesn't come across well and what does.

I wonder why you used the word 'student' above? Perhaps you don't mean student in the literal sense or perhaps there's a translation issue from Italian? Sorry, but it seemed like an odd word to use and I wonder if you meant something more by it.

I like your comments about deriving higher level properties from the lower level, but it isn't clear to me what you mean by emergence. The earlier thread I created on emergence looks at weak and strong emergence, so I wonder if you are concerned that some higher level can't be deduced from its lower level because of downward causation (i.e., the higher level is strongly emergent) or do you mean weak emergence?
viewtopic.php?f=10&t=28722

I'll back up just a bit. The intent of this thread is two fold.
1. Examine how science utilizes known physical laws which act at some low (micro-level) but still classical level to determine higher level phenomena.
2. Provide a vocabulary to discuss this common conception of how nature works.

The concept of breaking something down into smaller parts and examining how those parts interact so that the higher level assembly can be understood using a computer only dates back to roughly the 1950s. The first codes were used for stress analysis of aeronautical flight hardware. It was only then that we had computers powerful enough to do the tremendous number of calculations needed. I can sympathize with your concern regarding “… what properties of the system cannot be reproduced by computational models …” Even today, and even for completely static systems, we often can't accurately portray a simple stress analysis of a heavily loaded component. It's completely normal for a stress analysis done using FEA to give stress values that would imply cracking of the material while the actual material doesn't crack. But I know the reason for this is that the codes don't accurately simulate the nonlinearities of the material properties. The stress/strain behavior of the material, and the resulting deformation above what's called the “yield stress” of the material, isn't accurately set up in most computer FEA programs.

There are a number of reasons FEA doesn't work perfectly accurately, though you will find some amazingly accurate predictions. Some reasons they don't work well are:
1. Nonlinearities of physical properties aren't set up well or aren't well known.
2. The programs simply can't cope with those nonlinearities even if they had them.
3. The equations for physical interactions are not as accurate as they need to be. I've heard of problems for example with hypersonic velocities and the resulting air/solid interaction.
4. The amount of computer resources necessary to produce the resolution required is problematic, hence the need for computational power above even what is available today. The Blue Brain project is a perfect example.
5. Other problems.... there are lots of them.

There are a large number of problems with FEA and they are worth discussing, but I would like to suggest that for any analysis that requires the interactions of large numbers of atoms or molecules, such that the phenomenon sought does not make use of any of the special features of quantum mechanics, the analysis does not and should not consider non-local interactions. It should only consider interactions between neighboring elements. The problem with non-local interactions is that they will invariably lead to violations of conservation laws. And that's the same problem we have with downward causation and strong emergence. Even if one believes that those types of phenomena exist, I think it is still worth taking an unbiased look to see what philosophical predictions can be made when one assumes that classical physics is both local and separable.

The concept of a “control volume” dates back to the early 1900s, so I've decided to use the vocabulary from this earlier effort. I also use the control volume language because FEA language doesn't have the breadth to discuss this properly.

Just to emphasize one very basic issue, there is both a “differential formulation” and “finite formulation” of control volumes, the difference being that differential equations are applied to the differential formulation such that physical laws are obeyed at all points within the volume. In comparison, the finite formulation considers all mass within a control volume to have the same properties including velocity and density for example.
- Finite control volume: All properties are the same and there are no variations of physical properties within a control volume. This is a type of 'digital' world, best used to avoid those incalculable differential equations necessary for continuum mechanics.
- Differential control volume: Properties within the control volume vary so as to reflect the variations actually seen in a real volume of space.

This “finite control volume” concept is the basic concept behind FEA. All the properties inside a volume are given the same value, even though they don't actually have the same value in real life. FEA is arguably the best example of how one might simulate classical-scale physics using a computer, but I want to steer away from the idea of merely simulating a physical system using numerical methods. What is needed is a more general conceptual framework around which one can discuss a physical system and the phenomena which might arise from the aggregate of atoms and molecules in numerous small volumes of space. So the concept I want to use going forward is that of a differential control volume.
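As a toy illustration of the relationship between the two formulations (everything below is invented for the sketch), consider one-dimensional steady heat conduction with a uniform source. The differential formulation has the continuum solution u(x) = q x (1 - x) / 2k; the finite formulation assigns one temperature per small volume and balances fluxes across the control surfaces, and with reasonably small volumes it reproduces the continuum profile:

```python
# Steady 1-D heat conduction, -k u'' = q on (0,1) with u(0) = u(1) = 0.
# Differential formulation: continuum solution u(x) = q x (1-x) / (2k).
# Finite formulation: one temperature per volume, flux balance across
# each control surface.  All parameters are invented for the sketch.

def solve_finite_volumes(n, q=1.0, k=1.0, sweeps=5000):
    """Cell-centered volumes of width h; Gauss-Seidel iteration on the
    flux balance (flux in - flux out + q*h = 0) for every volume."""
    h = 1.0 / n
    u = [0.0] * n
    for _ in range(sweeps):
        for i in range(n):
            # Dirichlet walls sit half a cell away; a mirrored "ghost"
            # value keeps the wall temperature at zero.
            left = u[i - 1] if i > 0 else -u[0]
            right = u[i + 1] if i < n - 1 else -u[-1]
            u[i] = (left + right + q * h * h / k) / 2.0
    return u

def exact(x, q=1.0, k=1.0):
    """Continuum (differential formulation) solution."""
    return q * x * (1.0 - x) / (2.0 * k)

n = 20
u = solve_finite_volumes(n)
x_mid = (n // 2 + 0.5) / n            # center of a cell near x = 0.5
err = abs(u[n // 2] - exact(x_mid))   # finite vs differential answer
```

The piecewise-constant ('digital') temperatures converge on the smooth continuum profile as the volumes shrink, which is why the two formulations can be treated as two views of the same physics.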

Best regards.

Dave_C
Member

Posts: 329
Joined: 08 Jun 2014
Location: Allentown

### Re: Control volumes

Dave_C » Sun Jun 14, 2015 12:57 pm wrote:So the objective of this thread is to better define the more general concept. I’d like to discuss the basis for this kind of analysis - why do so many computer programs do it this way?
Because it works and it is relatively straightforward to apply.

Dave_C » Sun Jun 14, 2015 12:57 pm wrote:Why break something down into parts
Because it is possible to determine the behaviour of each part by applying suitable equations to it. In short, it works.

Dave_C » Sun Jun 14, 2015 12:57 pm wrote: Does this concept of breaking something down into parts reflect how nature works or is there something more to it?
It is generally a good model for how nature works. I'm not sure we ever go beyond models in science, and almost certainly never in engineering.

Dave_C » Sun Jun 14, 2015 12:57 pm wrote: We often hear the slogan, “The whole is more than the sum of the parts.” However, the use of computer programs to break things down like this seems to challenge those claims. If there is something more and we know what that something more is, then shouldn’t these programs be reflecting, in some way, what that something more is?
I can input the dimensions of the elements of a drill string into a FEA program. OD, ID, length, Young's modulus, etc. I can then determine how each element reacts to excitation, including the excitation generated by adjacent elements.

The something more, that is the sum of the parts, is discernment of the conditions under which the drill string will experience resonance. i.e. in the general case the "something more" is the output of the program.
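A stripped-down illustration of that last point, with a single damped oscillator standing in for the drill string and every parameter invented: the "something more" (the resonance) appears only as the output of the sweep, not in any individual input.

```python
# Feed element properties in, sweep the drive frequency, and the model
# itself reveals where the response peaks.  A single damped oscillator
# stands in for the drill string; all parameters are invented.
import math

def steady_state_amplitude(omega, m=1.0, k=100.0, c=0.5, f0=1.0):
    """Steady-state amplitude of m x'' + c x' + k x = f0 cos(omega t)."""
    return f0 / math.sqrt((k - m * omega ** 2) ** 2 + (c * omega) ** 2)

# Sweep drive frequencies and pick the peak: the resonance is not
# listed among the inputs; it is discerned from the program's output.
omegas = [0.1 * i for i in range(1, 301)]          # 0.1 .. 30 rad/s
peak_omega = max(omegas, key=steady_state_amplitude)
natural = math.sqrt(100.0 / 1.0)                   # sqrt(k/m) = 10 rad/s
```

The inputs are just m, c, k and a force; the resonance near sqrt(k/m) is the sum-of-the-parts output.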
Eclogite
Forum Moderator

Posts: 1362
Joined: 07 Feb 2007

### Re: Control volumes

Dave_C » June 22nd, 2015, 3:03 am wrote:I wonder why you used the word 'student' above? Perhaps you don't mean student in the literal sense or perhaps there's a translation issue from Italian? Sorry, but it seemed like an odd word to use and I wonder if you meant something more by it.

You're right. I tried to translate the Italian word "studioso", which would probably be something more like "scholar" than "student".
I wonder if you are concerned that some higher level can't be deduced from its lower level because of downward causation (ie: the higher level is strongly emergent) or do you mean weak emergence?
viewtopic.php?f=10&t=28722

Well, this is the point I'm concerned with.
I cannot exclude telepathy, or that metaphysical energies justify the effects of acupuncture. I simply stay with what can be explained (possibly scientifically) and look for new perspectives to explain (possibly scientifically) what cannot currently be explained.

My impression is that “strong emergence" is usually advocated when a higher-level system cannot be "reduced" to its lower-level, well defined, components. Downward causation sounds to me like the above metaphysical aspects: we have no proof to exclude them, but the scientific approach is to become able to neglect such a kind of explanations because we have an alternative rational, physical, testable explanation.

In a sense, strong emergence requires that further laws, fully independent of the lower level, be introduced to explain the behavior of a complex system. I find it fascinating that, in several cases, general theoretical frameworks have been sufficient to fill such gaps: for example, probability theory has provided the theoretical justification for the emergence of entropy and the arrow of time in macroscopic systems, thus providing a "reduction" of thermodynamics to molecular kinetics.
It's completely normal for a stress analysis done using FEA to give stress values that would result in cracking of the material while the actual material doesn't. But I know the reason for this is that the codes don't accurately simulate the nonlinearities of the material properties. The stress/strain of the material and the resulting deformation above what's called the “yield stress” of the material, isn't accurately set up in most computer FEA programs.
There are a number of reasons FEA doesn't work perfectly accurately, though you will find some amazingly accurate predictions. Some reasons they don't work well are:
1. Nonlinearities of physical properties aren't set up well or aren't well known.
2. The programs simply can't cope with those nonlinearities even if they had them.
3. The equations for physical interactions are not as accurate as they need to be. I've heard of problems for example with hypersonic velocities and the resulting air/solid interaction.
4. The amount of computer resources necessary to produce the resolution required is problematic, hence the need for computational power above even what is available today. The Blue Brain project is a perfect example.
5. Other problems.... there are lots of them.

see what I mean?

What appears to be "emergence" may often be the result of a gap in stepping from the lower to the higher level, due to the fact that some reasonable and theoretically solid phenomenon or process (independent from the laws of the lower level system) intervenes in generating unpredicted properties of the higher level system.

My impression is that, in a sense, emergence is a corollary of Gödel's theorem, in that the logic that fully and consistently describes an elementary system cannot also account for its interactions with other systems. So, if you consider this as a demonstration of the impossibility of strict reductionism, then Science itself has demonstrated that strict reductionism is theoretically untenable.

Still, Science has been able to account for many instances of strong emergence (and will possibly account for many more), thereby suggesting that a more general form of reductionism can actually be pursued. In general, the scientific community proceeds by phenomenologically investigating and separately describing the micro-system(s) and the macro-system to ever greater detail, as though accepting the impossibility of a "reduction", until some groundbreaking intuition opens the way to understand the mechanism(s) of the strong emergent aspects that “segregate” the two levels.

Sometimes, the intuitions don't even need to be so groundbreaking...
1. Nonlinearities of physical properties aren't set up well
...
3. The equations for physical interactions are not as accurate as they need to be.

Then we get to the very interesting aspect:
Just to emphasize one very basic issue, there is both a “differential formulation” and “finite formulation” of control volumes, the difference being that differential equations are applied to the differential formulation such that physical laws are obeyed at all points within the volume. In comparison, the finite formulation considers all mass within a control volume to have the same properties including velocity and density for example.

My impression is that once more the problem arises at the boundary.
A finite control volume (or finite element) model will presumably be adequate if the approximation gives rise to the same (qualitatively and quantitatively) relations between neighboring "cells".

In Biology we can perfectly account for quite complex interactions through apparently unjustified, but clearly incredibly robust, assumptions and approximations. One such example is the "constant field assumption" that describes the flow of ions across a biological membrane channel. Such an assumption makes it possible to analytically integrate the chemical and electric potentials across the membrane and predict ion flows (and currents) as functions of the ion concentrations at the two sides of the membrane and the electric potential difference across the membrane. And it works incredibly well!
Actually, you do encounter curious problems with very high ion concentrations or interactions between different ion species, and you have to go down to molecular interactions, but then it is easy to realize that under such conditions the constant-field approximation is not tenable (you have to reduce the scale of the finite control volume).
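For the curious, the constant-field (Goldman-Hodgkin-Katz) flux can be sketched in a few lines. The values below are typical textbook numbers for potassium, used only for illustration; the sanity check is that the current reverses exactly at the Nernst potential:

```python
# Constant-field (Goldman-Hodgkin-Katz) current for one ion species:
# the membrane is treated as a single control volume with a linear
# potential drop.  Concentrations and permeability below are typical
# textbook values for K+ at 37 C, used only for illustration.
import math

F = 96485.0      # Faraday constant, C/mol
R = 8.314        # gas constant, J/(mol K)
T = 310.0        # temperature, K

def ghk_current(v, p, z, c_in, c_out):
    """GHK current density (A/m^2); v in volts, p in m/s,
    concentrations in mol/m^3."""
    if abs(v) < 1e-9:
        # Limit of the GHK expression as V -> 0.
        return p * z * F * (c_in - c_out)
    xi = z * v * F / (R * T)
    return p * z * F * xi * (c_in - c_out * math.exp(-xi)) / (1.0 - math.exp(-xi))

# K+ with ~140 mM inside and ~5 mM outside: the current should reverse
# at the Nernst potential E_K = (RT/zF) ln(c_out/c_in), about -89 mV.
e_k = (R * T / F) * math.log(5.0 / 140.0)
i_at_nernst = ghk_current(e_k, p=1e-8, z=1, c_in=140.0, c_out=5.0)
i_at_zero = ghk_current(0.0, p=1e-8, z=1, c_in=140.0, c_out=5.0)
```

The approximation is, as you say, the interesting part: the whole channel interior is averaged into one constant-field volume, and it works until the conditions force you to shrink the volume.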

So, my impression is that finite control volumes can work well if the resulting boundary interactions are well reproduced. And if the resulting models do not describe well the higher level system, then that is nice, because it will signal to us that we are missing something. Which may well be some kind of downward causation, but most often appears to be some aspect we did not take into account.

Sorry if all this is rather unrigorous.

neuro
Forum Moderator

Posts: 2624
Joined: 25 Jun 2010
Location: italy

### Re: Control volumes

Eclogite, neuro, thanks for the comments.

I wonder if you would agree that (at least one) reason so many different areas of science resort to the 'Lego block' type analysis of FEA is that it reflects how nature works at the classical scale? Specifically, if some phenomenon can be described as 'classical' in the sense that Koch uses the concept, the causal interactions between those small volumes of space (finite elements) occur only with neighboring elements and not with other nonlocal elements. If I use the terminology presented, that statement becomes something like: If some phenomenon can be described as classical, then the causal actions between control volumes operate only through the control surfaces of neighboring control volumes and never through the control surfaces of nonlocal control volumes.

Dave_C
Member

Posts: 329
Joined: 08 Jun 2014
Location: Allentown