
Debate on Science Theory

Apropos our conversation a few weeks ago, here is a short essay Gopi, one of the main physicists developing the Reciprocal System, wrote about astronomy.  He is younger than the other guys, having just finished his PhD in physics.  I like him because he has a broader interest in philosophy and the history of science: http://reciprocalsystem.org/PDFa/Replacing%20the%20Foundations%20of%20Astronomy%20(Vijaya,%20Gopi%20Krishna).pdf

In the essay he compares the development of Newtonian dynamics with the epicycles of the previous age of astronomy and connects the holistic approach that I am advocating to you with not only Steiner but Hegel, who I was interested to learn had similar critiques of Newton that were later developed by the counter tradition that Steiner and Larson have carried further.

Also if you are interested here is Gopi lecturing on the history of mathematics in physics.  He goes into some interesting mathematical examples here that expand on my point to you about the way the “progression” of knowledge can be seen as examples of building postulates on top of erroneous postulates to keep up with the advance of observational data.  Basically epicycles.  Not only in astronomy of course, but it is easiest to see there.  Here Gopi discusses the path that was taken that hides the basic ratios involved with the strategy of postulating forces and constants.  The reciprocal system derives everything clearly from basic ratios.  It makes so much more sense.

The video is about a half an hour and then some questions. I think it connects the reciprocal system ideas well with what Arthur Young was trying to understand with his obsession with the third derivative.

http://reciprocalsystem.org/video/physics-history
Adam,

I’m almost in the clear, but I want to reply to your earlier message before replying to this one.

I hope to complete my reply to your earlier message tomorrow.

My comments are embedded below.

Regards,

-----Original Message-----
From: Adam Pogioli <satyavan8@gmail.com>

Apropos our conversation a few weeks ago, here is a short essay Gopi, one of the main physicists developing the Reciprocal System, wrote about astronomy.  He is younger than the other guys, having just finished his PhD in physics.  I like him because he has a broader interest in philosophy and the history of science: http://reciprocalsystem.org/PDFa/Replacing%20the%20Foundations%20of%20Astronomy%20(Vijaya,%20Gopi%20Krishna).pdf

In the essay he compares the development of Newtonian dynamics with the epicycles of the previous age of astronomy and connects the holistic approach that I am advocating to you.

Science, for me, as for many others, means a body of models that are practically useful because they fit experience/experiment better than any other models available at their time of acceptance — though they ARE likely to be superseded, to at least partially lose that acceptance within the community of science practitioners, and even within the community of the public at large, as experience/experiment advances and deepens — and also science means selecting the body of models that are found to predict, more fitly than other extant models, the experiences/experiments that are produced later, after those models have delivered their predictions for those experiences/experiments.

Such science thus has a strong, albeit not perfect, tendency to self-correct and to improve/progress over time.  Such a science is useful for technology advancement, for advancing the material life and the ‘meta-Darwinian fitness’ of the human species — for expanding the human-social “forces of [human-social re-]production“.

Science as such cannot claim absolute or “metaphysical” truth, nor can it be science if it is based upon metaphysical or spiritualistic postulates that have not been validated in communal human experience/experiment, let alone that have been falsified repeatedly in such experience/experiment, and, moreover, let alone philosophical postulates adopted simply because they sound more interesting, or are more in [transitory] philosophical fashion.

Newtonian physics is still enabling engineers to get space probes to the outer planets, on time and on place; to the loci targeted at the time targeted.

General Relativity, which provides a better fit to situations of extreme velocity/acceleration, and/or of extreme gravitational field intensity, does not predict enough difference in these outcomes, for our space probes, so far, to justify the greater computational time, computer memory-space, and human effort that applying it would require.

However, if you want a scientific theory — science in the sense defined above — that instantiates a holism and also yields the best fit to, at least, macro-cosmic experience to date, then you ought to be looking at the General Relativity model produced by Einstein.  Our GPS systems only work as accurately as they do because of a general-relativistic calculation that corrects their “clocks” away from the classical-Newtonian predictions.
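As a rough back-of-the-envelope check (using standard textbook constants and GPS orbital parameters, not figures drawn from this correspondence), the GPS clock correction mentioned above can be estimated in a few lines: special relativity slows the orbiting clock, while the weaker gravitational potential at altitude speeds it up, and the gravitational term wins.

```python
import math

# Standard physical constants and GPS orbital parameters (textbook values).
GM = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
c = 2.99792458e8      # speed of light, m/s
R_earth = 6.371e6     # mean Earth radius, m
r_gps = 2.6561e7      # GPS orbital radius (semi-major axis), m

# Special-relativistic time dilation: orbital speed slows the satellite clock.
v = math.sqrt(GM / r_gps)                        # circular-orbit speed, ~3.87 km/s
sr_rate = -v**2 / (2 * c**2)                     # fractional rate offset, negative

# Gravitational blueshift: higher (weaker) potential speeds the clock up.
gr_rate = GM / c**2 * (1 / R_earth - 1 / r_gps)  # fractional rate offset, positive

net_us_per_day = (sr_rate + gr_rate) * 86400 * 1e6
print(f"SR: {sr_rate * 86400 * 1e6:+.1f} us/day")
print(f"GR: {gr_rate * 86400 * 1e6:+.1f} us/day")
print(f"net: {net_us_per_day:+.1f} us/day")      # ~ +38 microseconds/day
```

The net drift of roughly 38 microseconds per day would, uncorrected, translate into kilometers of positioning error per day, which is why the correction is designed into the system.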

Einstein sought to implement Mach’s principle in his General Relativity theory — roughly that inertial mass is not a local/atomistic/”internal” property of each individual body of mass, but, on the contrary, is entirely the joint product of the gravitational interaction of all bodies upon each such body, itself included, so that, as the distribution of bodies of mass in the cosmos changes, the gravitic force felt by each individual body changes, which changes its movement, hence its contribution to the total cosmological gravitic interaction/field, which further drives the change(s) of the whole cosmological distribution of mass bodies.  Thus, to that extent, the mass body distribution/pattern of movement of the total cosmos is self-determining, self-changing, self-evolving, self-developing.  And the motion/positions of each individual mass body are determined by the whole, itself included.

My understanding is that Einstein’s 10 “simultaneous” nonlinear partial differential equations — or, equivalently, the single tensor equation — that embody his General Relativity model did not fully implement Mach’s principle.

For example, when those equations are used to model a space-only, mass-less cosmos — after all, a “counterfactual condition” — they still do not yield an absolutely uncurved — flat — Euclidean geometry for that space.
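For reference (the standard textbook form, not quoted from this correspondence), the "single tensor equation" is the Einstein field equation:

```latex
G_{\mu\nu} + \Lambda\, g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu},
\qquad
G_{\mu\nu} \equiv R_{\mu\nu} - \tfrac{1}{2}\, R\, g_{\mu\nu}
```

Setting the stress-energy tensor $T_{\mu\nu} = 0$ (and $\Lambda = 0$) yields the vacuum equations $R_{\mu\nu} = 0$, which still admit curved solutions, for example gravitational waves and the Schwarzschild exterior geometry; the theory does not force an empty spacetime to be flat, which is the partial failure of Mach's principle referred to above.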

But Einstein came damn close to that holistic model.  And its predictions fit experience to an amazing degree of precision — far surpassing those of any other known model of the macrocosmos.

with not only Steiner but Hegel, who I was interested to learn had similar critiques of Newton that were later developed by the counter tradition that Steiner and Larson have carried further.

I do not see Hegel, let alone Steiner, as authorities when it comes to science as defined above — unless they have produced models that I don’t know about that fit experience better than do those of, e.g., Newton and Einstein.  Nor do I see authority per se as the core principle of science.  I see Hegel as potentially contributing advanced heuristics for human inquiring and problem-solving thought, and for the systematic presentation of the fruits of that inquiry, iff one can scour away the dross of his metaphysics, and get to the metaphysics-free core of his heuristic thought algorithm.

Also if you are interested here is Gopi lecturing on the history of mathematics in physics.  He goes into some interesting mathematical examples here that expand on my point to you about the way the “progression” of knowledge can be seen as examples of building postulates on top of erroneous postulates to keep up with the advance of observational data.  Basically epicycles.  Not only in astronomy of course, but it is easiest to see there.  Here Gopi discusses the path that was taken that hides the basic ratios involved with the strategy of postulating forces and constants.  The reciprocal system derives everything clearly from basic ratios.  It makes so much more sense.

“Sense” in what sense?  Does it fit past experience/experiment better than Newton’s models, better than Einstein’s?  Does it make more accurate predictions?  And what good is it if it doesn’t?

Without the discipline of empirical fitness, all that you get from such theorizing is the best that philosophy can give — an ever-growing swamp of bickering and competing “schools” of thought, with, in principle, no rational method of deciding among them, of picking the best models, except for “philosophical taste”, and “philosophical fashion”, both of which are notoriously fickle and notoriously varying from person to person.  That is a no-practical-progress situation!

The video is about a half an hour and then some questions. I think it connects the reciprocal system ideas well with what Arthur Young was trying to understand with his obsession with the third derivative.

http://reciprocalsystem.org/video/physics-history

Adam Pogioli <satyavan8@gmail.com>

8/8/17

Thanks for your comments.  Sorry to keep pestering you with these things you are not interested in, but you often say you are interested when I speak to you in person, so I get the wrong impression.

I understand time is short and you probably want to focus on your own work and not get into material that might be worthless.  I can respect that for sure.  In that case it would be up to me to persuade you of the value of any ideas that I think could help your work before I could expect you to invest much time in them.

Though I am not sure why you assume up front that whatever I am studying is only, to quote the very misguided Sokal’s example of scientistic dismissal of philosophers, “fashionable nonsense”.  Which is basically what I got from your paragraph on metaphysics and “spiritual” notions above and other comments in previous exchanges. I am not sure what you based any of these comments on since they don’t seem to be addressed to anything I have sent you recently besides my own short and obviously generic comments on why I was sending you these things.   I do see why you wrote the comment on Hegel’s limitations; I only mentioned him to get you interested, not to reason by authority.

If you watch the video I think you will see an interesting discussion of the history of mathematics in physics.  The content certainly differs from your interpretation of the value of Newton and Einstein, but it is much less speculation on the metaphysical/teleological meaning of those thinkers than your own work.

I guess I want to understand this impasse since my impression of your work reflects what you seem to assume of my ideas.  It seems like you launch off from scientific ideas into a vast symbolic extrapolation that has spiritual meaning for you and hopefully others.  From what little I understand, after I believe a good if unsuccessful effort a few years back, you make teleological predictions (which are very much part of the metaphysical tradition not the purely scientific–even contemporary philosophy eschews teleology as too metaphysical), and have developed heuristics for the proliferation of categories of classification.  This fits more with the goals of some of early 20th century philosophy, not science.

I am sure you have done much more; I am sure there are many things within your algebras that go deep into the meaning of mathematics and other properly scientific subjects–for philosophical concepts, experiment, speculation, extrapolation, math, rhetoric… these things are never pure and isolated as the book you gave me on Newton, Maxwell and Marx so brilliantly demonstrates.  All the more so with people like us interested in a broader science of life and mind.  But I am mostly concerned with your ideas on mathematics and what you might make of what other alternative thinkers you have a lot in common with are doing.  We started talking about the reciprocal system again recently because your speculations on dark matter and energy are very similar to Larson’s. However, Larson’s concept of scalar motion, and the work of the researchers that have followed him with a more sophisticated topological understanding, are not just speculations but logical deductions from his basic premise of a universe of motion grounded in abstract change.  The deductions cascade through all of physics and beyond.  Inward and outward scalar motions all fall out of the mathematics of postulating everything as relative distributions within reference frames of correlative displacements from unity.

At least that is my characterization which I don’t think of as just one “theory” but an extension of ideas that emerge out of the whole character of 20th century scientific thinking when thought through as a whole and not just in isolated fields and problems.  Increasingly since the 19th century, with Riemann, Poincaré, etc., we have been struggling to come to grips with the greater abstract “space” of all possible rules, shapes and transformations.  Increasingly it seems obvious that space and time are not primary but ways of visualizing and serializing change, which at the level of absolute magnitudes is a basic ratio of displacement from unity, or a break in a fundamental symmetry that is prior to any concept of space.  Zero and infinity are byproducts of the reference frames we use, not part of the fundamental ratios.

There is much more we could discuss but I am not the best at the technical details, which is why I refer to those that are.  If you don’t get around to checking out the video, maybe you will want to discuss my understanding of this stuff in person when we meet up next month.  I do appreciate your honest feedback and friendship and I hope we can find some common ground in the future.  I get the feeling you have insights into the meaning of mathematics and its use for modelling the world and can help me make sense out of all this along with others I think are on similar tracks. Larson’s system isn’t some alternative theory starting from scratch with no relation to experiment.  It is the product of thinking through the results of the history of scientific experiment and trying to understand what has been found not in terms of a historical necessity, not taking whatever interpretation was decided upon historically and extrapolating from there, (which seems to be what you do, in which case it is clearly Hegelian metaphysics), but instead finding the structure of what is possible, of the many ways things can develop, of the myriad forms of combination that result from different assumptions and choosing the interpretations that not only fit the data but create a meaningful system.  The systems approach to science is what has actually been producing all the new insights and technology while it has been repressed out of physics for interesting reasons.  I do not think relativity qualifies as a systems understanding because, even if it was an attempt at unity, it is not coherent, that is, it is contradictory at its core.

So, while Larson made some successful predictions with his systems approach to physics, specifically with pulsars and quasars that later came true, one could say his few postulates, from which everything derives, predict most of known physics as logical possibilities and extrapolations.  But to answer your questions more directly:

The standard model of physics and its many offshoots fits data very well because that is what it is designed to do.  It is not a meaningful theory and has no logical consistency.  No one is claiming it is all rubbish.  Most of the math is approximately correct because it finds a good fit and that can lead to extrapolations and predictions, sometimes many of them, sometimes successfully, and other times not so much.  But as I said the other day, general relativity is mostly, as Freeman Dyson famously put it, “totally inert”. Yes it connected a few things, enough for a few successes, but mostly it sits there totally disconnected from useful physics mucking up astronomy and cosmology.

In contrast Larson’s model connects everything in physics its researchers have found over half a century (after some developmental adjustments) with no need for the arbitrary constants, proliferating assumptions, and fudge factors which have become the standard practice of physics.  Larson’s thought has made successful predictions but its strength is in consolidating what we know of mainstream science and in the vast arena it opens up for the future that we are only beginning to understand.  The contemporary physics scene is a practical dead end.  Billions spent on experiments every year with little to show because they have the wrong lens.  The vast data pouring in from these experiments has little practical value because it has no meaningful context, just heuristics, just proliferating categories, no real Hegelian synthesis, no real understanding.

Meanwhile all the real physics is being done in the black budget world because they have the right understanding.  They recognized important experimental results which defied what they thought they knew, and because they changed their theory they were able to redirect experiments towards the important areas that the alternative scene has been busy trying to figure out for themselves. The military stumbled upon things they couldn’t understand and rather than explain it away with ad hoc categories, they tested and revamped their theories until a new world opened up for them.  That world is where the action is at and it’s where the reciprocal system points–to what Larson called the cosmic sector, faster than light motion…it is not a metaphysical realm in the sense that it is beyond measurement, beyond physics (motion).  On the contrary, so much of the post-classical (quantum) experimental data is meaningless without being aware of this reciprocal of normal space/time motion, because it is the very nature of energy (time/space).  But everything is motion, or more precisely, ratio, change; fundamentally change is abstract and gets measured through reference systems.  Metaphysics, which I have no problem with, (it is, like you, what I do a lot of–interpreting the ultimate meaning and trajectory of this motion), is not what I need to discuss with you.

For Larson, who was not much of a metaphysical guy but a shrewd engineer, the cosmic sector and everything in his theory, everything that people have found experimentally, falls out of his basic insights with no need for all the metaphysical entities you believe in that are necessary to explain a system of forces that doesn’t meaningfully connect, and that are imagined to exist behind the recorded data.  In truth, everything is just distributions of ratios of space to time(motion) and it all falls out of that.
-----Original Message-----

Hi Adam,

Responses embedded below.

Regards,

From: Adam Pogioli <satyavan8@gmail.com>

Sent: Tue, Aug 8, 2017 1:33 am
Subject: Re: epicycles

Thanks for your comments.  Sorry to keep pestering you with these things you are not interested in, but you often say you are interested when I speak to you in person, so I get the wrong impression.

I am interested, but my schedule is demanding, and I can’t rightly invest too much time in theories that are essentially philosophical — that do not qualify as “new science”, because they do not make predictions that exhibit a superior match to recent experience/experiment/data than does “old science”.  I am happy with the amount of time I invested in the links you provided in your original message in this stack, and I wanted to give you my review of their content, which I did.  I did not mean to assert that such philosophical theories are of no use as philosophy, only that they are not of use as science, if they do not pass the test of better experiential/experimental prediction.  As philosophy, they may provide perspective, a feeling of unified understanding of matters which science separates, an aesthetically-pleasing conceptualization of experience, etc., which represent values in their own right.

I understand time is short and you probably want to focus on your own work and not get into material that might be worthless.  I can respect that for sure.  In that case it would be up to me to persuade you of the value of any ideas that I think could help your work before I could expect you to invest much time in them.

Yes!
Though I am not sure why you assume up front that whatever I am studying is only, to quote the very misguided Sokal’s example of scientistic dismissal of philosophers, “fashionable nonsense”.

I do not think that what you are studying is nonsense, only that it is ‘nonscience’.  When no arbiter of superior predictive power operates, taste, if not fashion, inevitably becomes involved in each person’s choice among alternative theories, or among alternative schools of thought.

Which is basically what I got from your paragraph on metaphysics and “spiritual” notions above and other comments in previous exchanges. I am not sure what you based any of these comments on since they don’t seem to be addressed to anything I have sent you recently besides my own short and obviously generic comments on why I was sending you these things.

I do see why you wrote the comment on Hegel’s limitations; I only mentioned him to get you interested, not to reason by authority.

I apologize if I misunderstood your meaning.  However, without the arbiter of superior predictive power, reasoning by authority, and academic authoritarianism, become more probable.

If you watch the video I think you will see an interesting discussion of the history of mathematics in physics.

The content certainly differs from your interpretation of the value of Newton and Einstein, but it is much less speculation on the metaphysical/teleological meaning of those thinkers than your own work.

I am not aware of teleological assumptions, either in the work of Newton, or of Einstein, or in my own work.  Aristotelian teleology assumes the operation of “final causes”, e.g., located somehow “in the future”, and impacting, somehow, the present.  Teleology in general presumes a purposive total cosmos, which pursues ends in a somehow conscious manner.  Note that, e.g., the limit-cycle attractor, or the fixed-point attractor, which is the predicted long-term asymptotic “destiny” of an unperturbed dynamical system, for a given “basin of attraction”, as long as its state stays within the boundaries of that “basin”, per the dynamical differential equation that describes and defines that dynamical system, is not a case of teleology, or of “final causes”.  Each present moment of such a dynamical system causally increments its convergence towards its attractor.  Over time, the cumulation of past presents’ causes of that convergence accumulates to a present moment when the system-state is closer to a state within that attractor set-of-states than any pre-assigned state-distance, no matter how small, barring external perturbations.  The ‘internal perturbations’ of that dynamical system are such that the attractor set-of-states is its inherent, immanent tendency, even if the state of the system never reaches zero-distance from states in that state-set in all finite time.
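A minimal numerical illustration of that non-teleological convergence (a toy linear system of my own construction, not anything drawn from the correspondence): each increment is caused only by the present state, with no "final cause" acting from the future, yet the trajectory approaches its fixed-point attractor ever more closely without reaching it in finite time.

```python
# Toy dynamical system dx/dt = -k * x, with fixed-point attractor at x = 0.
# Each Euler step uses only the PRESENT state; nothing "pulls" from the
# future, yet the distance to the attractor shrinks geometrically.
k, dt = 1.0, 0.1
x = 5.0
distances = []
for step in range(100):
    x = x + dt * (-k * x)      # present-state-caused increment
    distances.append(abs(x))   # distance to the attractor at x = 0

# The distance decreases monotonically but never reaches exactly zero.
assert all(d2 < d1 for d1, d2 in zip(distances, distances[1:]))
assert distances[-1] > 0.0
print(f"distance to attractor after 100 steps: {distances[-1]:.3e}")
```

The same behavior holds for limit-cycle attractors; only the attractor set is a closed orbit rather than a single point.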

I guess I want to understand this impasse since my impression of your work reflects what you seem to assume of my ideas.  It seems like you launch off from scientific ideas into a vast symbolic extrapolation that has spiritual meaning for you and hopefully others.  From what little I understand, after I believe a good if unsuccessful effort a few years back, you make teleological predictions (which are very much part of the metaphysical tradition not the purely scientific–even contemporary philosophy eschews teleology as too metaphysical), and have developed heuristics for the proliferation of categories of classification.  This fits more with the goals of some of early 20th century philosophy, not science.

When confined to the categorial-combinatorics limitations of the Q ‘purely’-qualitative dialectical categorial algebra, only qualitative predictions are possible.  And those predictions are intimated by iterating the categorial equations beyond the categories that the user decides represent the present ontological state of the universe of discourse and Domain that she/he is mapping, to generate combinations and self-combinations of those categories that the user recognizes as “present”, i.e., to combinations and self-combinations of those present categories that the user recognizes as not presently extant, at least not fully extant, though they may already be “fractionally” manifest in the present.  In my experience, as I think I illustrated for you for a ‘dialectical meta-model’ presentation of the contemporary written English language, there are always alternative interpretations/solutions of the meanings of the initially “algebraic-unknown” categorial combinations, generated in each iteration, and the interpretation/solution selected is a matter of choice on the part of the user.  Thus, the ‘Q dialectical method’ is an ‘algorithmic-heuristic’ method.  But it is not teleological, for the reasons given above.  It is extrapolative.  The method is based upon the observation that ‘categorial combinatorics and self-combinatorics’ well-describe the ontological content generated by the actual cosmos in the past, and that therefore we might form the expectation — albeit with no absolute guarantee that such an expectation will be met — that such categorial combinatorics and self-combinatorics will continue into the future, out to some typically as yet unknown finite limit.
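As an illustrative toy only (my own loose reconstruction of the "self-squaring" idea as described above, not the actual Q algebra or its notation): represent the present categories as a set, and let one iteration adjoin every combination and self-combination as a new, as-yet-uninterpreted candidate category, whose meaning the user must then supply.

```python
from itertools import product

def self_square(categories):
    """One iteration: keep the present categories, and adjoin every
    ordered combination and self-combination as a new candidate
    category.  Interpreting each new symbol (or discarding it as not
    yet extant) is left to the user, as the method describes."""
    new = {f"({a}x{b})" for a, b in product(categories, repeat=2)}
    return set(categories) | new

state = {"A"}               # a single "seed" category
state = self_square(state)  # adjoins the self-combination (AxA)
state = self_square(state)  # adjoins (Ax(AxA)), ((AxA)xA), ((AxA)x(AxA))
print(sorted(state))
```

The point of the sketch is only the combinatorial mechanism: each iteration is generated mechanically from the present category set, while the interpretive step that assigns meaning to the unknowns remains a human choice, which is why the method is 'algorithmic-heuristic' rather than purely algorithmic.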

I am sure you have done much more; I am sure there are many things within your algebras that go deep into the meaning of mathematics and other properly scientific subjects–for philosophical concepts, experiment, speculation, extrapolation, math, rhetoric… these things are never pure and isolated as the book you gave me on Newton Maxwell and Marx so brilliantly demonstrates.  All the more so with people like us interested in a broader science of life and mind.  But I am mostly concerned with your ideas on mathematics and what you might make of what other alternative thinkers you have a lot in common with are doing.

We started talking about the reciprocal system again recently because your speculations on dark matter and energy are very similar to Larson’s. However, Larson’s concept of scalar motion and the researchers that have followed him with a more sophisticated topological understanding, are not just speculations but logical deductions from his basic premise of a universe of motion grounded in abstract change.  The deductions cascade through all of physics and beyond.  Inward and outward scalar motions all fall out of the mathematics of postulating everything as relative distributions within reference frames of correlative displacements from unity.

Even the most formal-logically rigorous deduction, though it can help to refine a scientific theory — as was the case with Einstein’s Special Theory of Relativity, as well as with Newton’s Principia theory of “The System of the World” — is no guarantee of the scientific, empirical applicability or accuracy of the theorems so deduced.  A deduction is no better, in terms of empirical truth, than are its premises, its assumptions, its postulates and axioms, its definitions, its primitives and its rules of inference.  It is perfectly possible to concoct any number of ‘logically rigorous phantasies’ that contradict experimental results and human experience in any number of ways.  Thus, human experience and experimental, instrumental data, extending that experience, remain the crucial test of scientific theories, as opposed to philosophical speculations.  Deduction is inherently speculative.  Only the disciplining of the postulates to accord with experiential, practical truth [which, as such, is never final, or exempt from supersession] can facilitate scientific theory-making.  And, even then, deduction does not guarantee predictive superiority, and predictive superiority is the arbiter of theory acceptance in science, as distinct from philosophy.

At least that is my characterization which I don’t think of as just one “theory” but an extension of ideas that emerge out of the whole character of 20th century scientific thinking when thought through as a whole and not just in isolated fields and problems.  Increasingly since the 19th century, with Riemann, Poincaré, etc., we have been struggling to come to grips with the greater abstract “space” of all possible rules, shapes and transformations.  Increasingly it seems obvious that space and time are not primary but ways of visualizing and serializing change, which at the level of absolute magnitudes is a basic ratio of displacement from unity, or a break in a fundamental symmetry that is prior to any concept of space.  Zero and infinity are byproducts of the reference frames we use, not part of the fundamental ratios.

I agree with you in suspecting that physical space is not a fundamental objective aspect of the cosmos, and that, especially, “infinity” is a concept that accrues to ‘logically-rigorous phantasy’, at best, and not to the description of human experiential/experimental reality.  However, with regard to time, I suspect that Smolin and Unger are right:  I hold that time [i.e., self-change and other-induced change] — but not our conventional measurements of it — constitutes a fundamental aspect of physical reality — perhaps the most fundamental.  Perhaps time was the “first born” of the cosmos, the oldest actuality that we have so far even partly fathomed — “dark energy”, as the oft-accelerated expansion of apparent physical space, as “pure time”.

There is much more we could discuss but I am not the best at the technical details, which is why I refer to those that are.  If you don’t get around to checking out the video, maybe you will want to discuss my understanding of this stuff in person when we meet up next month.

Yes, I look forward to that opportunity.

I do appreciate your honest feedback and friendship and I hope we can find some common ground in the future.

I, too, appreciate your friendship, and viewpoint, and the fruitful dialogue that we have developed.

I get the feeling you have insights into the meaning of mathematics and its use for modelling the world and can help me make sense out of all this along with others I think are on similar tracks. Larson’s system isn’t some alternative theory starting from scratch with no relation to experiment.  It is the product of thinking through the results of the history of scientific experiment and trying to understand what has been found not in terms of a historical necessity, not taking whatever interpretation was decided upon historically and extrapolating from there, (which seems to be what you do, in which case it is clearly Hegelian metaphysics), but instead finding the structure of what is possible, of the many ways things can develop, of the myriad forms of combination that result from different assumptions and choosing the interpretations that not only fit the data but create a meaningful system.

No human being can escape the ‘human phenome’ that raised them, even though many think they can.  Therefore, my approach is Hegelian in the sense of the method of immanent critique of all received theories [including the continued immanent critique of my own theories].  My approach is non-Hegelian in that it is non-metaphysical — I do not indulge in ungrounded speculations, but limit myself to forming theories which fit the present qualitative, ontological experience/data about the past.  In this, my approach is similar to that of Karl Marx.  My interpretation or solution of the ‘algebraic unknown’ category-symbols that are generated by the self-reflexion, self-operation, self-aufheben, or “self-squaring” of the categorial expression of the present ontological state of history — i.e., the formation of predictions — does involve speculation about the future, but this speculation is grounded in the meaning of the already-solved-for category-symbols which describe the past ontological state of history.
Mathematics and science are opportunistic — whatever mathematical paradigms and physical theories succeed locally are generally adopted by mathematicians and scientists, without much initial concern about their incongruity vis-a-vis other mathematical axioms and axiom systems and physical theories, adopted for other localities, or about the coherence of the whole theoretical content of mathematics and of physical science.  Philosophy [of science, and of mathematics] can help clean up this mess, if it locates the potentials for congruity and coherence of the totality of present mathematical and scientific theory, without diminishing modeling efficacy and predictive power.

The systems approach to science is what has actually been producing all the new insights and technology while it has been repressed out of physics for interesting reasons.  I’d like to hear more about this from you when we meet for our annual September dinner!  

I do not think relativity qualifies as a systems understanding because, even if it was an attempt at unity, it is not coherent; that is, it is contradictory at its core.  Not sure what you mean here — General Relativity does lead to division-by-zero singularities at the extremes of mass-energy self-concentration and self-densification, and also when it attempts to describe the Domain presently theorized by Quantum Mechanics.

So, while Larson made some successful predictions with his systems approach to physics, specifically with pulsars and quasars, that later came true — this should be the most prominent feature of his work, if it is to qualify as science, as practical truth for the present, and not just as philosophy — one could say his few postulates, from which everything derives, predict most of known physics as logical possibilities and extrapolations.

But to answer your questions more directly:

The standard model of physics and its many offshoots fits data very well because that is what it is designed to do.  And, in my view, as well as in the views of many others, rightly so.  If it is not designed to do so, it is not science.

It is not a meaningful theory and has no logical consistency.  I do not see what you mean by “meaningful theory” here.  Newton’s and Einstein’s theories have, by Newton and Einstein themselves, and by others, been formulated in the form of highly rigorous formal-logical theories.

No one is claiming it is all rubbish.  Most of the math is approximately correct because it finds a good fit, and that can lead to extrapolations and predictions, sometimes many of them, sometimes successfully, and other times not so much.  But as I said the other day, general relativity is mostly, as Freeman Dyson famously put it, “totally inert”.  Yes, it connected a few things, enough for a few successes, but mostly it sits there, totally disconnected from useful physics, mucking up astronomy and cosmology.  The failure to link General Relativity and Quantum Mechanics is a reflection of the limitations of both theories, as well as of the underdevelopment of mathematical and physical theory in general, at present.  As far as scientific success, the General Theory of Relativity, for the prediction of the dynamics of the macrocosmos, is presently unequaled and unsurpassed.

In contrast Larson’s model connects everything in physics its researchers have found over half a century (after some developmental adjustments) with no need for the arbitrary constants, proliferating assumptions, and fudge factors which have become the standard practice of physics.

Larson’s thought has made successful predictions — that is the key to any claim it makes to being science — but its strength — its strength as philosophy — is in consolidating what we know of mainstream science

and in the vast arena it opens up for the future that we are only beginning to understand — that is only a hopeful and, presently, merely speculative claim, not yet substantiated as science.

The contemporary physics scene is a practical dead end.  Billions spent on experiments every year with little to show because they have the wrong lens.  The vast data pouring in from these experiments has little practical value because it has no meaningful context, just heuristics, just proliferating categories, no real Hegelian synthesis, no real understanding.  That is an extraordinary claim, which would seem to be falsified by much evidence — e.g., the detection of what we presently call “dark energy” and “dark matter”, of the accelerated self-expansion of the cosmos, of gravitational waves predicted by the General Theory of Relativity, etc., etc.  To substantiate such a claim, one would have to offer a new scientific theory that out-predicted all of the presently existing older scientific theories.  It is not my understanding that Larson has accomplished this.

Meanwhile all the real physics is being done in the black budget world because they have the right understanding.  Right understanding, in science, can be proven only by superior predictive power.  They recognized important experimental results which defied what they thought they knew, and because they changed their theory, they were able to redirect experiments towards the important areas that the alternative scene has been busy trying to figure out for themselves.  The military stumbled upon things they couldn’t understand, and rather than explain them away with ad hoc categories, they tested and revamped their theories until a new world opened up for them.  Can you substantiate this claim?

That world is where the action is, and it’s where the reciprocal system points — to what Larson called the cosmic sector, faster-than-light motion… it is not a metaphysical realm in the sense that it is beyond measurement, beyond physics (motion).  On the contrary, so much of the post-classical (Quantum) experimental data is meaningless without being aware of this reciprocal of normal space/time motion, because it is the very nature of energy (time/space).  But everything is motion, or more precisely, ratio, change; fundamentally change is abstract and gets measured through reference systems.  Metaphysics, which I have no problem with (it is, as with you, what I do a lot of — interpreting the ultimate meaning and trajectory of this motion), is not what I need to discuss with you.  These claims can only be persuasive, scientifically, if the theories described produce models better fit to human experience/experimental data.

For Larson, who was not much of a metaphysical guy but a shrewd engineer, the cosmic sector and everything in his theory, everything that people have found experimentally, falls out of his basic insights with no need for all the metaphysical entities you believe in — I’m not clear about what beliefs you are attributing to me here — that are necessary to explain a system of forces that doesn’t meaningfully connect — I’m not sure what “doesn’t meaningfully connect” means here — and that are imagined to exist behind the recorded data — this “behindism” should be accepted only until a scientifically superior theory, requiring less positing of such ‘noumena’, arrives.  In truth, everything is just distributions of ratios of space to time (motion), and it all falls out of that.  Again, this assertion can be scientifically persuasive only if the theories deduced from this assertion lead to scientifically superior — predictively and ‘retrodictively’ superior — theories.  Otherwise, we are back to metaphysical speculation and dogmatic assertion all over again — back to the “Middle Ages”.

Adam Pogioli <satyavan8@gmail.com>

8/25/17
Thanks for the detailed response.  I hope I am not distracting you too much with this discussion.  We can talk about this stuff more when we get together next month. For now I will try and approach what seems like a mess of misunderstandings. Most of them seem once again to hinge on your belief that what you are doing is somehow science and what I talk about is philosophy.

You say: “My approach is non-Hegelian in that it is non-metaphysical — I do not indulge in ungrounded speculations, but limit myself to forming theories which fit the present qualitative, ontological experience/data about the past.  In this, my approach is similar to that of Karl Marx”

You form theories that fit the data we have.  That description fits everybody I read.  What real thinker would argue with facts?  Of course, deciding what the facts actually are, beyond the literal and obvious recording of such and such measurement, has been a huge part of science.  What makes you a scientist?  You make predictions?  Aren’t your predictions basically speculation on future states of society?  Every philosopher does that.  But if this is the role you see for the philosophy of science — “Philosophy [of science, and of mathematics] can help clean up this mess (incoherent theory), if it locates the potentials for congruity and coherence of the totality of present mathematical and scientific theory, without diminishing modeling efficacy and predictive power” —

then I see why you would think you are less of a philosopher, but then neither are you doing an immanent critique as you claim; instead you are merely extrapolating from that mess.  You are, however, clearly doing theoretical research, which may have great value, as it has all through the history of science.  You don’t seem too concerned with questioning basic physical assumptions, but you are altering them and redefining them, which is what most of the people I read and talk about are doing.  Call it whatever you want.  The semantics of field divisions isn’t worth discussing.  I think we are both doing more general theory work than the field-specific people that are considered scientists, though you definitely are more interested in mathematical theory and seem to be doing original research.

In any case the point is that the ideas I pitch to you do fit the data, and fit it much better than the normal interpretations.  This is partly because physicists have increasingly abandoned meaning.  Back in Einstein and Bohr’s day they would pore over the data and spend a good portion of their time discussing and debating what the data meant.  When they couldn’t really understand what the quantum data was telling them, Bohr’s complementarity set the pattern for the future: that the meaning didn’t matter.  Bohm at least tried.  I think Feynman was bothered but had no solution.  Most physicists have been happy with positing new metaphysical entities with magical properties, called forces, that do whatever they need them to do, with no ground or connection to deeper principles that would carry the discovery over into further extrapolations and discoveries.

That you say this about the practice of adding new assumptions for every theoretical problem just to fit the data blows my mind:

And, in my, as well as in the views of many others, rightly so.  If it is not designed to do so, it is not science.

I hope you just mean here that the theory shouldn’t contradict the data, but if that is what you mean, I have to wonder about so many assumptions you must be making about things I am saying.  I am critiquing ad hoc theorizing.  It doesn’t explain anything, open up new research, or even make predictions beyond the formal details of what was already known.  All it does is formalize what the data says with arbitrary constructs.  This is what the epicycle theory did.  If this is science, we would never have gotten Newton or Maxwell or Einstein.  I thought this is what we were discussing.

The progress that has happened in physics, since now it is forbidden to have any new revolutions like those that have defined the discipline, has mostly been due to increasingly powerful instruments that usually serve to disprove previous theories rather than confirm them.  But rather than use that data to spur a new revolution in understanding, new metaphysical entities like “dark matter” are invented and the game goes on.  This is bad metaphysics and ends up being bad science because it does not understand what it predicts or finds, and therefore limits its value.

The rest of science has progressed much better than physics precisely because they had to do real science, which means real thinking, as opposed to just improving techniques to refine data and extrapolating mathematics to no end — the “shut up and calculate” paradigm.  As you know, when the systems are just too complex to solve completely, which is almost every real system, you can’t just calculate; you have to think, you have to make decisions about meaning.  And because a lot of smart people have been thinking about complex systems for decades now, thinking and testing, testing and thinking, there are definite truths about the way things work that emerge, though there is much room for different interpretations.

Philosophy as a specific field can come in and help us choose the better interpretations, but there is no sharp division between philosophy and theory.  Philosophy tends to be more general, but even strictly experimental scientists have to do some theoretical thinking.  Contemporary philosophy often goes by the name “Theory” for that reason, because all the best thought is interdisciplinary.  The sharper you make the distinction between science and philosophy, the more it sounds like the medieval “scientists”, claiming the epicycles are proven merely because they model observation, and any other theory that models the data with a new system is heresy.  We truly are at a similar crossroads.  The epicycle theory fit the data well.  Feyerabend has even argued that in some sense the facts were on the side of the Church, and they had good reason to distrust the new theory and even refuse to look in Galileo’s telescope.  Feyerabend’s whole “philosophy”, by which I mean his analysis of science history, is a challenge to this notion that science progresses by facts.  Facts are meaningless without interpretation, and different interpretations come and go and often return in new forms, depending on sociological factors.

What often happens is the wrong meaning gets smuggled in without much thought.  And by wrong I mean limited.  As Feyerabend says, “there is no idea, no matter how ancient or absurd, that can’t add to our knowledge”.  Bad ideas add very little, but it isn’t always clear which ones are bad, especially if the context for understanding them is not there.  Which is why older abandoned theories often come back in new forms.  Sometimes an idea or model can explain a little corner of existence, but the better ideas connect that corner with other things, illuminating them and carrying the mind further to new truths and experiments.

 

But I feel like this is not what this disagreement is about.  I mean, you gave me Thomas K. Simpson.  You seem well read in the scientific literature.  You know the importance of theory.  It is what you do.  You are not an experimental scientist, except in the realm of ideas and mathematics.  The two of Simpson’s books I have are all about how the great theorists like Maxwell used mathematics along with rhetoric to illustrate new theoretical ideas that change our relationship to the data and the world.  You have to know this, so what is this disagreement about?  It seems to boil down to your belief that whatever I am interested in may have value, you admit, but it must be purely speculative and aesthetic and have no real importance for scientific progress.  I admit I have had similar thoughts about your work, but I took the time to address the ideas as best I could.  And I still would never say that with any certainty, because I haven’t really engaged the ideas in any depth.

It seems as if you come to your conclusions because you assume that if any of the alternative science stuff were true, people would have realized it, so you don’t have to actually consider it seriously.  I have directed you to evidence many times, but you have never taken it seriously, because I think you expect some singular experimental event that everyone recognizes as significant right away.  But this is the way it always is, despite the physics mythology.  Michelson/Morley is presented as some key moment that proved the ether theory dead and set up the need for relativity.  But just what it means is still debated, and some people still argue it only applies to some of the then-current ether theories.  It is considered a turning point because of many factors that are sociological, and not just a clear example of evidence breaking theoretical continuity.

Evidence for my views abounds just as in hindsight we can see how the evidence pointed to new conceptions in key points in physics, but it is never obvious to everyone.  In the past, when there was more respect for conceptual coherence, scientists were hungry enough for new fundamental theories if the old ones didn’t work.  So much so that relativity was given a hearing, despite its radicality and its many problems, because it made some sense, and people cared about sense.  Nowadays it is much different for new radical theories.  As long as they merely add to the structure and don’t challenge any dogma they can be as weird and ridiculous as the worst of the alternative scene.

But that is different.

You must see that the current paradigm inhibits any new change in view.  I am reading about Maxwell’s revolution with the field concept right now.  It would never have flown today.  No one wants a new view of the universe.  Every new discovery that points to these alternative models I study, or opens up the space for new ones, is ignored or neutralized.  The significance is not seen, because it is easier to explain it away with a new assumption or force than it is to really understand and follow its significance.  Even if the incoherence is so stark they have no explanation, no one notices or cares.  (I am reading about the solar eclipse effect on a Foucault pendulum right now, something easily explained with the systems physics, specifically Kozyrev’s model.)

Anyways, I feel you have tried to be respectful and I appreciate that, but I think you have the same problem with me I had with you.  Neither one of us is convinced of the value of the other’s theory enough to delve into much of the specifics.  I admit that; though it must be stated that I don’t really have the tools to understand the specifics of your philosophy without more preparation.

 

 

Hi Adam,

Responses embedded below.

Regards,

-----Original Message-----
From: Adam Pogioli <satyavan8@gmail.com>

Sent: Tue, Aug 8, 2017 1:33 am
Subject: Re: epicycles

Thanks for your comments.  Sorry to keep pestering you with these things you are not interested in, but you often say you are interested when I speak to you in person, so I get the wrong impression.

I am interested, but my schedule is demanding, and I can’t rightly invest too much time in theories that are essentially philosophical — that do not qualify as “new science”, because they do not make predictions that exhibit a superior match to recent experience/experiment/data than does “old science”.  I am happy with the amount of time I invested in the links you provided in your original message in this stack, and I wanted to give you my review of their content, which I did.  I did not mean to assert that such philosophical theories are of no use as philosophy, only that they are not of use as science, if they do not pass the test of better experiential/experimental prediction.  As philosophy, they may provide perspective, a feeling of unified understanding of matters which science separates, an aesthetically-pleasing conceptualization of experience, etc., which represent values in their own right.

 

I understand time is short and you probably want to focus on your own work and not get into material that might be worthless.  I can respect that for sure.  In that case it would be up to me to persuade you of the value of any ideas that I think could help your work before I could expect you to invest much time in them.

Yes!
Though I am not sure why you assume up front that whatever I am studying is only, to quote the very misguided Sokal’s example of scientistic dismissal of philosophers, “fashionable nonsense”.

I do not think that what you are studying is nonsense, only that it is ‘nonscience’.  When no arbiter of superior predictive power operates, taste, if not fashion, inevitably becomes involved in each person’s choice among alternative theories, or among alternative schools of thought.

Which is basically what I got from your paragraph on metaphysics and “spiritual” notions above and other comments in previous exchanges. I am not sure what you based any of these comments on since they don’t seem to be addressed to anything I have sent you recently besides my own short and obviously generic comments on why I was sending you these things.

I do see why you wrote the comment on Hegel’s limitations; I only mentioned him to get you interested, not to reason by authority.  I apologize if I misunderstood your meaning.  However, without the arbiter of superior predictive power, reasoning by authority, and academic authoritarianism, become more probable.

I wasn’t arguing at all, I was trying to get you interested enough to look into the RS so that we can discuss the ideas themselves rather than whether they have any value based on your misconceptions that I am having a hard time straightening out.  I thought seeing the RS as part of a tradition that includes Hegel and I think current dynamical systems theory might make you interested.

If you watch the video I think you will see an interesting discussion of the history of mathematics in physics.

The content certainly differs from your interpretation of the value of Newton and Einstein, but it is much less speculation on the metaphysical/teleological meaning of those thinkers than your own work.  I am not aware of teleological assumptions, either in the work of Newton, or of Einstein, or in my own work.  Aristotelian teleology assumes the operation of “final causes”, e.g., located somehow “in the future”, and impacting, somehow, the present.  Teleology in general presumes a purposive total cosmos, which pursues ends in a somehow conscious manner.  Note that, e.g., the limit-cycle attractor, or the fixed-point attractor, which is the predicted long-term asymptotic “destiny” of an unperturbed dynamical system, for a given “basin of attraction”, as long as its state stays within the boundaries of that “basin”, per the dynamical differential equation that describes and defines that dynamical system, is not a case of teleology, or of “final causes”.  Each present moment of such a dynamical system causally increments its convergence towards its attractor.  Over time, the cumulation of past-presents’ causes of that convergence accumulates to a present moment when the system-state is closer to a state within that attractor set-of-states than any pre-assigned state-distance, no matter how small, barring external perturbations.  The ‘internal perturbations’ of that dynamical system are such that the attractor set-of-states is its inherent, immanent tendency, even if the state of the system never reaches zero-distance from states in that state-set in all finite time.
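That non-teleological convergence is easy to see numerically.  Here is a minimal sketch (my own illustration, not taken from the correspondence), assuming the simplest one-dimensional system dx/dt = -k(x - x*), whose fixed-point attractor is x* = 1.0: each step's "cause" of convergence is purely the present state, yet the distance to the attractor eventually beats any preassigned tolerance without ever reaching zero in finite time.

```python
# Convergence of a damped 1-D dynamical system toward its
# fixed-point attractor x_star, via explicit Euler integration.
# (simulate, k, x_star are hypothetical names for illustration.)

def simulate(x0, x_star=1.0, k=0.5, dt=0.01, steps=2000):
    """Integrate dx/dt = -k*(x - x_star); return the trajectory."""
    xs = [x0]
    for _ in range(steps):
        x = xs[-1]
        xs.append(x + dt * (-k * (x - x_star)))
    return xs

traj = simulate(x0=5.0)
distances = [abs(x - 1.0) for x in traj]

# The distance to the attractor shrinks at every step...
assert all(d2 <= d1 for d1, d2 in zip(distances, distances[1:]))
# ...falls below a preassigned tolerance, yet never hits exactly zero.
assert 0.0 < distances[-1] < 1e-3
```

No "final cause" appears anywhere: the update rule consults only the present state, and the attractor emerges as the cumulative tendency of those present-moment increments.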

I agree that this dynamical systems approach is different from traditional teleology, and it is an approach that the social scientists I follow use as well.  That doesn’t change the fact that using the paradigm of “predictive science”, as it is used in science for specific data sets, and applying it to very general and ontological speculations on the scale that you do, would be considered scientific posturing by most scientists.  And I think they are right.  What you are doing is philosophical speculation, albeit in the format of analytical philosophy with its penchant for logical formality.

 

I guess I want to understand this impasse since my impression of your work reflects what you seem to assume of my ideas.  It seems like you launch off from scientific ideas into a vast symbolic extrapolation that has spiritual meaning for you and hopefully others.  From what little I understand, after I believe a good if unsuccessful effort a few years back, you make teleological predictions (which are very much part of the metaphysical tradition not the purely scientific–even contemporary philosophy eschews teleology as too metaphysical), and have developed heuristics for the proliferation of categories of classification.  This fits more with the goals of some of early 20th century philosophy, not science.

When confined to the categorial-combinatorics limitations of the Q ‘purely-qualitative’ dialectical categorial algebra, only qualitative predictions are possible.  And those predictions are intimated by iterating the categorial equations beyond the categories that the user decides represent the present ontological state of the universe of discourse and Domain that she/he is mapping, to generate combinations and self-combinations of those categories that the user recognizes as “present” — i.e., to combinations and self-combinations of those present categories that the user recognizes as not presently extant, at least not fully extant, though they may already be “fractionally” manifest in the present.  In my experience, as I think I illustrated for you in a ‘dialectical meta-model’ presentation of the contemporary written English language, there are always alternative interpretations/solutions of the meanings of the initially algebraic-unknown categorial combinations generated in each iteration, and the interpretation/solution selected is a matter of choice on the part of the user.  Thus, the ‘Q dialectical method’ is an ‘algorithmic-heuristic’ method.  But it is not teleological, for the reasons given above.  It is extrapolative.  The method is based upon the observation that ‘categorial combinatorics and self-combinatorics’ well describe the ontological content generated by the actual cosmos in the past, and that therefore we might form the expectation — albeit with no absolute guarantee that such an expectation will be met — that such categorial combinatorics and self-combinatorics will continue into the future, out to some typically as-yet-unknown finite limit.

I am sure you have done much more; I am sure there are many things within your algebras that go deep into the meaning of mathematics and other properly scientific subjects–for philosophical concepts, experiment, speculation, extrapolation, math, rhetoric… these things are never pure and isolated as the book you gave me on Newton Maxwell and Marx so brilliantly demonstrates.  All the more so with people like us interested in a broader science of life and mind.  But I am mostly concerned with your ideas on mathematics and what you might make of what other alternative thinkers you have a lot in common with are doing.

 

We started talking about the reciprocal system again recently because your speculations on dark matter and energy are very similar to Larson’s. However, Larson’s concept of scalar motion and the researchers that have followed him with a more sophisticated topological understanding, are not just speculations but logical deductions from his basic premise of a universe of motion grounded in abstract change.  The deductions cascade through all of physics and beyond.  Inward and outward scalar motions all fall out of the mathematics of postulating everything as relative distributions within reference frames of correlative displacements from unity.

Even the most formal-logically rigorous deduction, though it can help to refine a scientific theory — as was the case with Einstein’s Special Theory of Relativity, as well as with Newton’s Principia theory of “The System of the World” — is no guarantee of the scientific, empirical applicability or accuracy of the theorems so deduced.  A deduction is no better, in terms of empirical truth, than are its premises, its assumptions, its postulates and axioms, its definitions, its primitives and its rules of inference.  It is perfectly possible to concoct any number of ‘logically rigorous phantasies’ that contradict experimental results and human experience in any number of ways.  Thus, human experience and experimental, instrumental data, extending that experience, remain the crucial test of scientific theories, as opposed to of philosophical speculations.  Deduction is inherently speculative.  Only the disciplining of the postulates to accord with experiential, practical truth [which, as such, is never final, or exempt from supersession] can facilitate scientific theory-making.  And, even then, deduction does not guarantee predictive superiority, and predictive superiority is the arbiter of theory acceptance in science, as distinct from philosophy.

Most theories today don’t predict much at all; they have such limited scope and merely fit the data that was already recorded.  The RS fits the data for all science on a general level; it changes the meaning of some fundamental theory, but it doesn’t actually alter the majority of the details, while at the specific points where it disagrees with current theory (but not data) it points towards important research that could be done that would add to its predictive superiority.  Only a dozen or so independent researchers have really been working on it, and mostly at a level of generality that is not going to be able to compete in the game of precise predictions at the level of detail that most specialists, working with complex instruments at the edge of physics, are doing.  There have been successful predictions, but mostly they are playing catch-up, as we are all doing with the mountain of research data out there.  But unlike the mainstream, they are finding that much of the results coming in from our society’s increasingly sophisticated engineering makes sense in the RS and flows deductively from the few postulates of one fundamental theory that cover all of nature.
At least that is my characterization, which I don’t think of as just one “theory” but as an extension of ideas that emerge out of the whole character of 20th-century scientific thinking when thought through as a whole and not just in isolated fields and problems.  Increasingly since the 19th century, with Riemann, Poincaré, etc., we have been struggling to come to grips with the greater abstract “space” of all possible rules, shapes and transformations.  Increasingly it seems obvious that space and time are not primary but ways of visualizing and serializing change, which at the level of absolute magnitudes is a basic ratio of displacement from unity, or a break in a fundamental symmetry that is prior to any concept of space.  Zero and infinity are byproducts of the reference frames we use, not part of the fundamental ratios.  I agree with you in suspecting that physical space is not a fundamental objective aspect of the cosmos, and that, especially, “infinity” is a concept that accrues to ‘logically-rigorous phantasy’, at best, and not to the description of human experiential/experimental reality.  However, with regard to time, I suspect that Smolin and Unger are right:  I hold that time [i.e., self-change and other-induced change] — but not our conventional measurements of it — constitutes a fundamental aspect of physical reality — perhaps the most fundamental.  Perhaps time was the “first born” of the cosmos, the oldest actuality that we have so far even partly fathomed — “dark energy”, as the oft-accelerated expansion of apparent physical space, as “pure time”.

Now you are talking rather metaphysically so I will give you my summary thoughts on this idea which follow Deleuze/Delanda’s theory which they base on systems theory, along with my interpretation of the systems-style physics like the RS:

What we render as space and time are constructed out of a cascade of symmetry breaks from a unified continuum.  But unlike the “hylomorphic model” that presupposes logical categories imposed on an inert substance — whether that be matter, space, energy, or even time as some primordial substrate or attribute of some fundamental objects or monads — the unified continuum contains its own ordinal difference.  In a way you are right that the idea of seriality must presuppose any manifestation.  But this is logical order, not necessarily “time” as we know it.  Ancient Chinese philosophy says yin and yang are equiprimordial, but in practice yang is logically prior to yin — ontologically, in the sense that heaven, the creative seed of idea, pure motion and incessant change, is prior to the receptive, Earth, the seed idea’s specific manifestation.

But it isn’t that space receives the forms from heaven/mind/time as a passive receptacle.  There is a cascade of symmetry breaks that creates increasingly bound, and eventually metric and Euclidean, spaces, each with its own kind of time, all in order to realise and develop the values of the higher topological levels (Klein’s hierarchy).  DeLanda emphasizes that this actualization of the virtual is not the realization of an essence, because the topological singularities that structure all actual processes are realized divergently.  The qualities that actual things possess don’t resemble their attractors, and in fact often hide the “intensive” dynamics that created them under the cover of the contingent convergences of process in the resulting form (like parallel evolutionary convergence).
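To make the Klein reference concrete, here is my own toy illustration in Python (the maps and numbers are invented for the example; this is my gloss, not anything from Larson or DeLanda).  In the Erlangen-program picture, each geometry is defined by a group of transformations, and restricting the group (affine down to Euclidean) makes more quantities invariant:

```python
# A toy illustration (my own, not from Larson or DeLanda) of Klein's
# Erlangen-program hierarchy: each geometry is the study of the invariants
# of a transformation group, and restricting the group (affine -> Euclidean)
# makes more quantities invariant.
import math

def euclidean(p):
    """Rotation plus translation: a rigid motion, preserving distances."""
    c, s = math.cos(0.7), math.sin(0.7)
    return (c * p[0] - s * p[1] + 2.0, s * p[0] + c * p[1] - 1.0)

def affine(p):
    """An invertible linear map plus translation: preserves collinearity and
    ratios of lengths along a line, but not the lengths themselves."""
    return (2.0 * p[0] + 1.0 * p[1] + 3.0, 0.5 * p[1] - 4.0)

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

# Three collinear points; B divides AC in the ratio 1:2.
A, B, C = (0.0, 0.0), (1.0, 1.0), (3.0, 3.0)

for f in (euclidean, affine):
    a, b, c = f(A), f(B), f(C)
    print(f.__name__, round(dist(a, b), 3), round(dist(a, b) / dist(b, c), 3))
# Both maps keep the segment ratio at 0.5; only the Euclidean map
# also keeps the distance itself.
```

Each restriction of the symmetry group in this sense “creates” new invariant structure, lengths and angles where before only incidence and ratio were real, which is one way to read the cascade of symmetry breaks.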

This is just my summary of ideas that fall out of the logic suggested by the whole paradigm of evidence that is systems thinking.  Categories are useful if the sets are fuzzy and contingent, but one must not reify the abstractions; one must understand their genesis historically and topologically.  Things form out of an intensive continuum, not just in some primordial beginning (which assumes linear time) but in every moment, whether it is the birth of elements in stars or the birth of life out of a chemical gradient.  That continuum is always a meshwork of abstract relations of pure magnitude, ratios of change that can be given a framework that translates the ratio into space/time or time/space.  This idea, inductively derived from general science, is roughly the idea that the RS starts with, and from which it deductively predicts and explains everything.

 

There is much more we could discuss but I am not the best at the technical details, which is why I refer to those that are.  If you don’t get around to checking out the video, maybe you will want to discuss my understanding of this stuff in person when we meet up next month.  Yes, I look forward to that opportunity.

I do appreciate your honest feedback and friendship and I hope we can find some common ground in the future.  I, too, appreciate your friendship, and viewpoint, and the fruitful dialogue that we have developed.

I get the feeling you have insights into the meaning of mathematics and its use for modelling the world and can help me make sense out of all this, along with others I think are on similar tracks.  Larson’s system isn’t some alternative theory starting from scratch with no relation to experiment.  It is the product of thinking through the results of the history of scientific experiment and trying to understand what has been found, not in terms of a historical necessity, not taking whatever interpretation was decided upon historically and extrapolating from there (which seems to be what you do, in which case it is clearly Hegelian metaphysics), but instead finding the structure of what is possible, of the many ways things can develop, of the myriad forms of combination that result from different assumptions, and choosing the interpretations that not only fit the data but create a meaningful system.  No human being can escape the ‘human phenome’ that raised them, even though many think they can.  Therefore, my approach is Hegelian in the sense of the method of immanent critique of all received theories [including the continued immanent critique of my own theories].  My approach is non-Hegelian in that it is non-metaphysical — I do not indulge in ungrounded speculations, but limit myself to forming theories which fit the present qualitative, ontological experience/data about the past.  In this, my approach is similar to that of Karl Marx.  My interpretation or solution of the ”’algebraic unknown”’ category-symbols that are generated by the self-reflexion, self-operation, self-aufheben, or “self-squaring” of the categorial expression of the present ontological state of history — i.e., the formation of predictions — does involve speculation about the future, but this speculation is grounded in the meaning of the already solved-for category-symbols which describe the past ontological state of history.
Mathematics and science are opportunistic — whatever mathematical paradigms and physical theories succeed locally are generally adopted by mathematicians and scientists, without much initial concern about their incongruity vis-a-vis other mathematical axioms and axiom systems and physical theories adopted for other localities, or about the coherence of the whole theoretical content of mathematics and of physical science.  Philosophy [of science, and of mathematics] can help clean up this mess, if it locates the potentials for congruity and coherence of the totality of present mathematical and scientific theory, without diminishing modeling efficacy and predictive power.

The systems approach to science is what has actually been producing all the new insights and technology while it has been repressed out of physics for interesting reasons.  I’d like to hear more about this from you when we meet for our annual September dinner!  

I do not think relativity qualifies as a systems understanding because, even if it was an attempt at unity, it is not coherent; that is, it is contradictory at its core.  Not sure what you mean here — General Relativity does lead to division-by-zero singularities at the extremes of mass-energy self-concentration and self-densification, and also when it attempts to describe the Domain presently theorized by Quantum Mechanics.

I am talking about conceptual absurdities.  The math may be consistent but the singularities point to better models.  The more New Agey scene of alternative science is obsessed with the singularities and there are several fads around this point but they only vaguely intuit the truth of scalar motion and the deeper meaning of those ontological transformation points that they and you would like to model.

 

So, while Larson made some successful predictions with his systems approach to physics, specifically with pulsars and quasars that later came true — this should be the most prominent feature of his work, if it is to qualify as science, as practical truth for the present, and not just as philosophy — one could say his few postulates, from which everything derives, predict most of known physics as logical possibilities and extrapolations.

Again, by these standards there would be very little left we could call science.  Most science these days is done as exploratory research, not as explicit adherence to the scientific method with formal hypotheses and all that.  Feyerabend argues it never was done that way.  It is difficult to generalize across all fields, but some of the best data is being generated from the many strategies of data collection that don’t originate with a specific falsifiable hypothesis, and some of the most corrupted data is coming from those with a particular point to make and much pressure to find what they expect to find.  Tactics range from deliberate tampering and obfuscation to the universal method of mathematical fudging in physics.

With so much research and so many conflicting conclusions being drawn, the most practical and useful theorizing is done to model the data gained from research for further analysis and extrapolation.  Besides the mediocre niche predictions of the specialists, the most important theorizing is extrapolation to general and qualitative directions that can open up new avenues of research, like yours and Larson’s; very little important science fits the falsifiability process that Popper idealized and Feyerabend repeatedly debunked.

 

But to answer your questions more directly:

The standard model of physics and its many offshoots fit data very well because that is what they are designed to do.  And, in my view, as well as in the views of many others, rightly so.  If it is not designed to do so, it is not science.

It is not a meaningful theory and has no logical consistency.  I do not see what you mean by “meaningful theory” here.  Newton’s and Einstein’s theories have, by Newton and Einstein themselves, and by others, been formulated in the form of highly rigorous formal-logical theories.

No one is claiming it is all rubbish.  Most of the math is approximately correct because it finds a good fit, and that can lead to extrapolations and predictions, sometimes many of them, sometimes successfully, and other times not so much.  But as I said the other day, general relativity is mostly, as Freeman Dyson famously put it, “totally inert”.  Yes, it connected a few things, enough for a few successes, but mostly it sits there, totally disconnected from useful physics, mucking up astronomy and cosmology.  The failure to link General Relativity and Quantum Mechanics is a reflection of the limitations of both theories, as well as of the underdevelopment of mathematical and physical theory in general, at present.  As far as scientific success, the General Theory of Relativity, for the prediction of the dynamics of the macrocosmos, is presently unequaled and unsurpassed.

Both theories are off, I agree.  Theory has been underdeveloped, but not so much for lack of effort.  Thousands of PhDs work on it, but no one questions much.  They end up in mathematical mazes like string theory.  This is precisely because of a lack of philosophical sophistication among theoretical physicists.  I am reading right now Thomas K. Simpson praising Maxwell’s grounding in Scottish metaphysics and Kant.  Einstein also knew his Kant.  Bohr, Bohm, all the greats of the last great revolution knew how to think and were widely read.  Now it is more a field for the borderline-autistic geniuses or savants that Feynman exemplified.  It is part of our culture’s shift toward hypermanic thinking rather than deep attention and reflection.  Building ever higher castles in the sky, then renormalizing everything like it is all just the way it goes.

 

In contrast, Larson’s model connects everything in physics its researchers have found over half a century (after some developmental adjustments), with no need for the arbitrary constants, proliferating assumptions, and fudge factors that have become standard practice in physics.

Larson’s thought has made successful predictions — that is the key to any claims for itself as science — but its strength — its strength as philosophy — is in consolidating what we know of mainstream science

and in the vast arena it opens up for the future that we are only beginning to understand — that is only a hopeful and, presently, merely speculative claim, not yet substantiated as science.

The contemporary physics scene is a practical dead end.  Billions are spent on experiments every year with little to show because they have the wrong lens.  The vast data pouring in from these experiments has little practical value because it has no meaningful context, just heuristics, just proliferating categories, no real Hegelian synthesis, no real understanding.  That is an extraordinary claim, which would seem to be falsified by much evidence — e.g., the detection of what we presently call “dark energy” and “dark matter”, of the accelerated self-expansion of the cosmos, of gravitational waves predicted by the General Theory of Relativity, etc., etc.  To substantiate such a claim, one would have to offer a new scientific theory that out-predicted all of the presently existing older scientific theories.  It is not my understanding that Larson has accomplished this.

Again, dark matter was not a discovery, and it is not a triumph of theoretical physics; it is a triumph of engineering that brought back data showing how wrong the theories are.  But rather than admitting that they can’t find 95% of the matter the theory predicts they would find, they just postulated that the missing 95% must be an invisible kind of matter they had just discovered!  No one has discovered anything.  Dark energy and the accelerated expansion are other examples, all of which are central to Larson’s theory, which was laid out in the 1950s.  Gravity waves are a joke, and their supposed discovery is one of the funniest farces in recent physics, at least the way Miles Mathis tells it.  In any case, there are phenomena predicted by the RS that could correlate just as well, and within the RS all these things make sense together, whereas none of them make sense together in the standard model of cosmology.  It is a mess, as you admit.

 

Meanwhile all the real physics is being done in the black-budget world because they have the right understanding.  Right understanding, in science, can be proven only by superior predictive power.  They recognized important experimental results that defied what they thought they knew, and because they changed their theory, they were able to redirect experiments toward the important areas that the alternative scene has been busy trying to figure out for themselves.  The military stumbled upon things they couldn’t understand, and rather than explain them away with ad hoc categories, they tested and revamped their theories until a new world opened up for them.  Can you substantiate this claim?

Predictive power does not mean right understanding.  Again you dodge the whole point of my original message with this repeated statement.  We are in epicycle territory.  Many competing theories can make the same prediction; it doesn’t prove anything.  Especially when the data is already gathered, or the thrust of it is predictable on practical engineering levels like so much theory today, it doesn’t validate the theoretical underpinning.  Even Popper would agree.

This should be obvious.

 

As for the military physics, there is much evidence to support that this is what happened. Curiously this is the same time that physics went off the rails.  Follow the rails and they lead underground.

That world is where the action is, and it’s where the reciprocal system points–to what Larson called the cosmic sector, faster-than-light motion… It is not a metaphysical realm in the sense that it is beyond measurement, beyond physics (motion).  On the contrary, so much of the post-classical (Quantum) experimental data is meaningless without being aware of this reciprocal of normal space/time motion, because it is the very nature of energy (time/space).  But everything is motion, or more precisely, ratio, change; fundamentally change is abstract and gets measured through reference systems.  Metaphysics, which I have no problem with (it is, like you, what I do a lot of–interpreting the ultimate meaning and trajectory of this motion), is not what I need to discuss with you.  These claims can only be persuasive, scientifically, if the theories described produce models better fit to human experience/experimental data.

They do.

For Larson, who was not much of a metaphysical guy but a shrewd engineer, the cosmic sector and everything in his theory, everything that people have found experimentally, falls out of his basic insights with no need for all the metaphysical entities you believe in — I’m not clear about what beliefs you are attributing to me here — that are necessary to explain a system of forces that doesn’t meaningfully connect — I’m not sure what “doesn’t meaningfully connect” means here — and that are imagined to exist behind the recorded data — this “behindism” should be accepted only until a scientifically superior theory, requiring less positing of such ”’noumena”’, arrives.  In truth, everything is just distributions of ratios of space to time (motion), and it all falls out of that.  Again, this assertion can be scientifically persuasive only if the theories deduced from it lead to scientifically superior — predictively and ‘retrodictively’ superior — theories.  Otherwise, we are back to metaphysical speculation and dogmatic assertion all over again — back to the “Middle Ages”.

I was attributing to you the belief in things like “dark matter”, which all admit is a name for something they don’t understand.  It is little more than a placeholder.  I understand you think that you only accept theories provisionally, until a better model has been demonstrated in experiment.  What you have to understand is that all the models in physics have been breaking down for decades in the face of new data.  The experimental technology is way beyond the theory, and the theory is so broken that even the fudges and ad hoc theories aren’t working.  You can’t buy the hype of the main science news.  Plenty of physicists are speaking out, and even mainstream news stories are coming out with pieces on the corruption of science research in general.  People like Miles Mathis are documenting many of the problems in physics and have been offering more coherent models for some time, but there are plenty of dissenters in the establishment too, who still use some of the old theory even as they point in new directions.  Better alternatives abound.  What the smart dissenters are doing is looking at the system of theory itself for the truth, rather than taking the theoretical objects in the theory for granted as truth.  This more abstract approach is having more success at the highest levels of mainstream theory, as the ideas of particles and forces become more and more just abstract symmetry relations.  It’s all coming back to topology.  It’s all about the structure of differential relations.  This is no absolute truth either, but it gives us a more general picture of the phenomena we are dealing with, as opposed to the often misleading generalizations and predictions coming from models based on specific objects and their supposed qualities.

For example, what DeLanda is at pains in his books to show is how often the qualities we ascribe to entities as their essence in relation to some category are exactly not essential to their meaning and history.  Yes, a certain chemical element has a certain number of electrons, but what is important for real understanding and prediction is precisely the things that change about an entity: its differential structure that emerges from the possibility space, in this case the space of possible atomic structures, the isotopic variants that determine transmutation potentials, and the patterns between elements that predict the patterns of chemical synthesis.  More of this in the next email….

—————————————————————

*and this parallel thread with the same individual:

I enjoyed talking to you today.  I think you are on the right track with physics but I would love to point out how and why you might want to eschew some of the discipline’s fundamental assumptions.  I think starting from different principles might fit some of your other ideas better, though it might challenge some of your conclusions and the work you have done.

You were really close today at intuiting why what Larson calls scalar motion is so important. Scalar motion is Larson’s main building-block-concept that he derives “directly” from observation.  His best intro book looks just at this concept and its base in neglected facts: https://reciprocalsystem.com/nfs/index.htm
I also wanted to make clear my stance on essentialism: that it is only marginally better to replace essential things with essential processes.  To give an example in my main practical field, shifting from blaming a virus for a disease to blaming a disease process that the virus name merely refers to just makes the fundamental essentialism more dynamic and elusive, but maintains the scapegoat logic of traditional categorical thinking.  A disease is much more effectively analyzed as an attractor in phase space.  It isn’t caused by a pathogen, because pathogens are harmless in different environmental and psychological contexts.  What makes something what it is depends on everything else related to it; therefore it is not useful, and often harmful, to think in essentialist ways.  This has been one of the major themes of the last half century in theory.
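To sketch what I mean by an attractor in phase space, here is a deliberately crude toy model (my own sketch in Python; the names and numbers are invented, so take it as an illustration of the logic, not as epidemiology):

```python
# A deliberately crude toy model (my own sketch; names and numbers are
# invented) of "disease as an attractor in phase space": a bistable system
# where the same perturbation settles into different attractors depending
# on context, so the outcome belongs to the whole landscape, not to a
# single pushing "cause".

def flow(x, context):
    """dx/dt for a tilted double-well potential V(x) = x**4/4 - x**2/2 + context*x.
    Two attractors, roughly x ~ -1 ("healthy") and x ~ +1 ("diseased");
    the context term tilts the landscape and shifts the basin boundary."""
    return -(x ** 3 - x + context)

def settle(x0, context, dt=0.01, steps=10_000):
    """Euler-integrate forward until the state settles near an attractor."""
    x = x0
    for _ in range(steps):
        x += dt * flow(x, context)
    return x

# The same "pathogen exposure" (a push of the state to x0 = 0.2) lands in
# different basins in different contexts:
benign_context = settle(0.2, context=0.3)    # settles near the healthy well
adverse_context = settle(0.2, context=-0.3)  # settles near the diseased well
print(benign_context < 0, adverse_context > 0)  # prints: True True
```

The point is that the “harmfulness” is not in the pathogen: flip the context parameter and the identical perturbation relaxes to the other attractor.  The outcome belongs to the landscape as a whole.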
Instead of essences, Deleuze uses the word “multiplicity” from mathematics to refer to entities’ essential structure in the space of possibilities.  This structure is not clear and distinct like traditional essence, because it depends on context and relations that can never be exhausted by reference systems, but I don’t think this precludes using some version of categories and sets as you do.  It just acknowledges there is no transcendent space of identity or presence, since all categories/ideas/multiplicities and their actualized assemblages are in a dependent relationship with each other.  Time is not some agent of change on a static background of forms, building novelty in law-like regularity.  There are no independent entities like space, time, electron, force, energy, etc., or us for that matter; we all derive from everything and in turn create everything else.  There is no privileged position in the topological continuum.  Breaking that foundational and very immanent symmetry creates position and the affine geometry of scalar expansion/contraction.  Dark matter and energy need not be abstract unknowns; they are obvious ordinal consequences of the topological continuum as you break its symmetry with scalar assumptions about reference points.
I think any immanent understanding of physics has to eschew the assumption of a transcendent spatial dimension acting as container for the others.  Larson’s physics anticipated what has emerged in the contemporary complexity paradigm in its role as a general theory.  The more I understand of your thinking, the more it interests me as a development of aspects of modernist thought as it tried to incorporate change and difference but did not fundamentally question representational logic–or in the sciences, with the struggles of traditional logic-based modelling of intelligence that eventually led to the very post-modern approach of connectionism and its progeny in contemporary AI research.
 In any case, essence and identity are displaced by systems of differences, not– as you seem to be saying– in a merely “present” but changing ontological whole, but in a truly immanent universe in the sense that no entity is ever transcendentally outside the text of the universe, being merely “past” or present and available for neutral representation.  I do not deny the reality of the electron for instance, but its reality is quite a diffuse system of differences lacking much coherence or use at this point.  I prefer other systems of ideas like I have mentioned to you that make what we call an electron much more coherent and understandable than the belabored attempts to maintain a particle concept dealing with a phenomenon that should have thrown this whole paradigm of being as presence out of the game a century ago.

Hi Adam,

At last, I can get to this:  responses embedded below.

Regards,

 

—–Original Message—–
From: Adam Pogioli <satyavan8@gmail.com>
Sent: Tue, Jun 20, 2017 1:25 am
Subject: follow up

I enjoyed talking to you today.  As did I to you.

I think you are on the right track with physics but I would love to point out how and why you might want to eschew some of the discipline’s fundamental assumptions.  I think starting from different principles might fit some of your other ideas better, though it might challenge some of your conclusions and the work you have done.  Continual immanent self-critique, as well as the study of the works of others, is a continual self-challenge, and I welcome it:  it is the very engine of my progress.  But, of course, I have to set my own schedule, and use my own judgment as to how best to allocate my time among these many, many self-challenging tasks.

 

You were really close today at intuiting why what Larson calls scalar motion is so important. Scalar motion is Larson’s main building-block-concept that he derives “directly” from observation.  His best intro book looks just at this concept and its base in neglected facts: https://reciprocalsystem.com/nfs/index.htm

I also wanted to make clear my stance on essentialism: that it is only marginally better to replace essential things with essential processes.

I do not subscribe to essentialism at all, either as an essentialism of what may be conceived as ‘nounic’ things, or as an essentialism of what may be conceived as ‘verbic’ processes, or even as an essentialism of what may be conceived as ‘eventities’ — as a synthesis of the two.

Essentialism tends to posit a strictly local inherence, a ubiety, as determining the characteristics constituting the nature of a locally-manifest entity, rather than an ”’ubiquity”’, by which any locally-manifest entity is determined in its nature by its interaction with the totality — with all other, including ‘otherwhere-local’, entities, but also by its interaction with itself, because it itself is also a part of that totality.  This is the Bohmian ”’holonomic/holographic”’ principle, which is already embodied, to a degree, in Einstein’s General Theory of Relativity, his theory of the Universal Gravitic Field, and, to a lesser degree, even in Maxwell’s Theory of the Universal Electromagnetic Field.

I would describe my view as ‘compositionalism’, or ‘constitutionalism’ [though the latter term has unwanted constitutional-law connotations], recognizing the ‘emergent properties’, ‘emergent qualities’, and emergent new ontology — the irruption of new kinds of being — that result thereby.

When a field of monads which are exhibiting their given defining qualities becomes, for the first time in universal history, populous enough and concentrated enough to ‘compose with themselves’, to irrupt combinations of themselves — new, ‘meta-monads’ — these ‘meta-monads’ inherit aspects of the qualities of the monads that ”’composed”’ them, but, in addition, new, ’emergent qualities’ also become manifest for the first time.

When atoms, in a pre-molecular, interstellar ‘atomic cloud’, self-densify sufficiently to turn this cloud into a “molecular cloud”, the ‘meta-monads’ of the ‘atom-monads’, namely, the ‘molecule-monads’, inherit and reproduce aspects of the behaviors of their constituent atoms.  But they also manifest unprecedented behaviors and qualities, not manifest in atoms outside of their bonding into molecules.

To give an example in my main practical field, shifting from blaming a virus for a disease to blaming a disease process that the virus name merely refers to just makes the fundamental essentialism more dynamic and elusive, but maintains the scapegoat logic of traditional categorical thinking.  A disease is much more effectively analyzed as an attractor in phase space.  It isn’t caused by a pathogen, because pathogens are harmless in different environmental and psychological contexts.  What I write about should be conceived of as ‘meta-attractors’ of ‘meta-super-systems’ in ‘state/control meta-space’, undaunted by division-by-zero singularities.

What makes something what it is depends on everything else related to it,  PRECISELY!!!   

therefore it is not useful and often harmful to think in essentialist ways.  Einstein’s General Theory of Relativity embodies precisely this ”’holonomic/holographic”’ [cf. Bohm] principle, as I elaborated in another recent e-mail message to you.  It’s too bad that you dismiss Einstein’s work, and, moreover, for non-scientific reasons!   

This has been one of the major themes of the last half century in theory.  That fact may be a sign of the experiential/experimental correctness of this theme, or it may not.  Just consider the thematic trends typical of the ‘human phenome’ of the Dark/Middle Ages in scholastic, Aristotelian scholarship.  Popularity does not necessarily guarantee scientific validity.

If it is a scientific theme, it must prove itself in both the laboratory, ”’in vitro”’, context, and also in the ”’in vivo”’, ‘open Nature’ context.  Maxwell’s Electromagnetic Field Theory and Einstein’s Gravitational Field Theory have done so — in spades!!! — for the aspects of reality which they address.  To my knowledge, Larson’s “scalar motion” theory has not.

Instead of essences, Deleuze uses the word “multiplicity” from mathematics to refer to entities’ essential structure in the space of possibilities.  An <<arithmos>>, i.e., a kind-of-being category, is precisely a multiplicity that is treated, in part because of the limitations of human minds’ capabilities to track multitudes of monads, as a unit or a unity.

This structure is not clear and distinct like traditional essence because it depends on context and relations that can never be exhausted by reference systems, but I don’t think this precludes using some version of categories and sets as you do.  I agree.

It just acknowledges there is no transcendent space of identity or presence since all categories/ideas/ multiplicities and their actualized assemblages are in a dependent relationship with each other.  Agreed.

Time is not some agent of change on a static background of forms building novelty in law-like regularity.
The prevailing concept of, reified, “Time” to which you refer here is a classic if usually unrecognized case of the ideological inversion of real subjects/agents and real objects of which Marx warned.  I view time as the resultant, not the cause, of change — as the ensemble, the chorus, the symphony, of the self-activity and other-activity of all monads; as the ‘<<rhythmos>>’ of the dialectic.  Dialectic is the cause of time.  ‘Self-duality’ is the cause of dialectic.

There are no independent entities like space, time, electron, force, energy, etc., or us for that matter; we all derive from everything and in turn create everything else.  There is no privileged position in the topological continuum.  I believe that this approach is one-sided and impractical, and risks the error of which Hegel warned, under the name ”’that dark, dark night in which all cows are black”’.  As I see it, the totality is real, but also its self-induced self-differentiations are real, though never absolutely separate, or radically dual or plural.  The cosmos, I believe, is more like a developing, self-differentiating zygotic embryo than like either a gas of Democratean atoms, or a Parmenidean/Advaitist undifferentiated, homogeneous, eternally static blob of “PURE BEING”.

Breaking that foundational and very immanent symmetry creates position and the affine geometry of scalar expansion/contraction.  Dark matter and energy need not be abstract unknowns; they are obvious ordinal consequences of the topological continuum as you break its symmetry with scalar assumptions about reference points.  I think that you’ll have to unpack this for me for it to have any chance of becoming persuasive to me.  I am inclined to believe that asymmetry, not symmetry, is the key to the creative crises of the <<kosmos>>.

I think any immanent understanding of physics has to eschew the assumption of a transcendent spatial dimension acting as container for the others.  So does Lee Smolin.  So do I.

Larson’s physics anticipated what has emerged in the contemporary complexity paradigm in its role as a general theory.  Can you demonstrate this?

The more I understand of your thinking, the more it interests me as a development of aspects of modernist thought as it tried to incorporate change and difference but did not fundamentally question representational logic–or in the sciences, with the struggles of traditional logic-based modelling of intelligence that eventually led to the very post-modern approach of connectionism and its progeny in contemporary AI research.  I see my work as an immanent critique of ‘The Modern Ideology’ — the Capital-mentalite’, the ‘human phenome’ of the capitalist epoch — entire.  It is thus both an <<aufheben>> negation and an <<aufheben>> extension of that which it immanently critiques.  Consider Marx’s critique of the ideology-vitiated science of capitalist “classical political economy”.  If you and/or your sources have “fundamentally questioned” what you mean by “representational logic”, what has been the practical fruition of this questioning?  What do you mean by “post-modern”?  Most of what I hear characterized as “post-modern” are pro-decadence, pro-contracted-social-reproduction ideologies, engineered, funded, and spun by the Rockefeller plutocracy to assist in its effort to reverse the historic growth of the social self-force of social self-reproduction, that growth being the key to human progress and to human liberation, but being therefore also a social self-force that, if it continues to grow, will eventually overthrow capitalist ruling-class rule, much to the chagrin of the Rockefeller plutocracy.

In any case, essence and identity are displaced by systems of differences, not– as you seem to be saying– in a merely “present” but changing ontological whole, but in a truly immanent universe in the sense that no entity is ever transcendentally outside the text of the universe, being merely “past” or present and available for neutral representation.  I see the ontology of all pasts as being <<aufheben>>-contained inside every new-born present, and I see every present ‘cumulum’, as represented by a ‘qualitative, non-amalgamative sum’ of ontological categories, as, precisely, a ”’system of differences”, or even as a ‘system of conserved oppositions.  But I also see past, present, and future as intermutually qualitatively different, and real, and I reject all of the mysticisms of “eternal timelessness”, as being contra-experiential.

I do not deny the reality of the electron for instance, but its reality is quite a diffuse system of differences lacking much coherence oruse at this point.  I would have thought that the vast reliability and practical success of electronic circuitry, ubiquitous in our time and clime, and all based upon the theory of the electron, would have demonstrated otherwise to your satisfaction! 

I prefer other systems of ideas like I have mentioned to you that make what we call an electron much more coherent and understandable than the belabored attempts to maintain a particle concept dealing with a phenomenon that should have thrown this whole paradigm of being as presence out of the game a century ago.  I agree that “point-particle” models are deficient — as do many physicists today — but I do not understand your objection to “being as presence”, unless it is due to a believe that all pasts are simultaneously “present” in some timeless, transcendental, metaphysical — mystical — way.

Preference for systems of ideas is a philosophical inclination, or a matter of personal taste.  Scientists, as such, are duty-bound — however much this duty may be honored more in the breach — to uphold the “systems of ideas” that have — almost certainly only temporarily — won the consensus of the scientific community, supposedly because of their greater descriptive/reconstructive/predictive efficacy, until or unless they can produce a theory/system of tested ideas that demonstrates greater descriptive/reconstructive/predictive efficacy than the [until-then] prevailing system of tested ideas.
By what tests could your preferred views be corroborated, or, even more importantly, falsified?

 

Adam Pogioli <satyavan8@gmail.com>

8/25/17

Responses below in green where appropriate, and where I haven’t already addressed the point in a previous email:

 
 

On Fri, Aug 18, 2017 at 1:47 AM, <profitsci@aol.com> wrote:

Hi Adam,

At last, I can get to this:  responses embedded below.

Regards,

—–Original Message—–
From: Adam Pogioli <satyavan8@gmail.com>
Sent: Tue, Jun 20, 2017 1:25 am
Subject: follow up

I enjoyed talking to you today.  As did I to you.

I think you are on the right track with physics but I would love to point out how and why you might want to eschew some of the discipline’s fundamental assumptions.  I think starting from different principles might fit some of your other ideas better, though it might challenge some of your conclusions and the work you have done.  Continual immanent self-critique, as well as the study of the works of others, is a continual self-challenge, and I welcome it:  it is the very engine of my progress.  But, of course, I have to set my own schedule, and use my own judgment as to how best to allocate my time among these many, many self-challenging tasks.

You were really close today at intuiting why what Larson calls scalar motion is so important. Scalar motion is Larson’s main building-block-concept that he derives “directly” from observation.  His best intro book looks just at this concept and its base in neglected facts: https://reciprocalsystem.com/nfs/index.htm

I also wanted to make clear my stance on essentialism: that it is only marginally better to replace essential things with essential processes.

I do not subscribe to essentialism at all, either as an essentialism of what may be conceived as ‘nounic’ things, or as an essentialism of what may be conceived as ‘verbic’ processes, or even as an essentialism of what may be conceived as ‘eventities’ — as a synthesis of the two.

Essentialism tends to posit a strictly local inherence, a ubiety, as determining the characteristics constituting the nature of a locally-manifest entity, rather than a ubiquity, by which any locally-manifest entity is determined in its nature by its interaction with the totality — with all other, including ‘otherwhere-local’, entities, but also by its interaction with itself, because it itself is also a part of that totality.  This is the Bohmian ‘holonomic/holographic’ principle, which is already embodied, to a degree, in Einstein’s General Theory of Relativity, his theory of the Universal Gravitic Field, and, to a lesser degree, even in Maxwell’s Theory of the Universal Electromagnetic Field.

 

There is a difference between the rather generic notion of objects affecting each other at a distance or together creating a field that conditions or constitutes an entity, and the concept of multiplicity that Deleuze gets from what Bergson and Husserl developed from Riemann.  Again, I think you realize the need to get beyond essentialism and want a truly immanent theory, but what these guys are saying is that dialectical philosophy, by merely opposing and combining the one and the many, fails to produce the immanent structure of possibilities that constitutes an entity in its differential structure.  The categories created by dialectics are much too general and abstracted from identities reified from other abstractions.  The concept of multiplicity helps us instead evaluate each thing as an accumulation of capacities–not a static and definable list of qualities, but something with a structure of possibilities where we must be aware of and justify the choice of parameters that define this structure.  Unlike Derrida, who highlights this core of choice and undecidability, Deleuze has definite suggestions for modelling things along possibilities that highlight their potential for new connections, which is why he tends to put down the “arborescent” structure of dialectic in favor of the “rhizome” pattern that creates divergent spaces of actualization.
 
I don’t think the contrast between identity/presence and difference/multiplicity is as black and white as he makes it, and think dialectic has its place.  But we can see the limitations of this approach of placing things within a singular organizing space with general relativity, which in my opinion stops short of the radicalization of space that Deleuze sees in Riemann.  The result is that we still have a global embedding space: even if with Einstein it is no longer Euclidean, even if, as you say, the object can affect itself, you still have space as a container or universal medium.  Instead I am suggesting space emerges in progressive symmetry breaks through a virtual possibility space that imposes a reference system on a unified but heterogeneous continuum, thereby creating spatial and temporal relationships that structure any entity through its unique history and differential relations in possibility space.  The RS gets specific with the physics of this idea, as it deduces all the ways abstract ratios at the projective layer of manifestation can get distributed through different reference systems depending on the level of symmetry and the combinations of motion.  These combinations can create very different speed ranges which drastically affect the manifestation of time–this makes a big difference in astronomy and led to Larson’s correct predictions concerning pulsars and quasars, since only in his theory are these things understood with their necessary involvement with high-speed motion (faster than light through motion combinations).
 

I would describe my view as ‘compositionalism’, or ‘constitutionalism’ [though the latter term has unwanted constitutional-law connotations], recognizing the ‘emergent properties’, ‘emergent qualities’, and emergent new ontology — the irruption of new kinds of being — that result thereby.

When a field of monads which are exhibiting their given defining qualities becomes, for the first time in universal history, populous enough and concentrated enough to ‘compose with themselves’, to irrupt combinations of themselves — new ‘meta-monads’ — these ‘meta-monads’ inherit aspects of the qualities of the monads that ‘composed’ them, but, in addition, new ‘emergent qualities’ also become manifest for the first time.

When atoms, in a pre-molecular, interstellar ‘atomic cloud’ self-densify sufficiently to turn this cloud into a “molecular cloud”, the ‘meta-monads’ of the ‘atom-monads’, namely, the ‘molecule-monads’, inherit and reproduce aspects of the behaviors of their constituent atoms.  But they also manifest unprecedented behaviors and qualities, not manifest in atoms outside of their bonding into molecules.

To give an example in my main practical field, shifting from blaming a virus for a disease to blaming a disease process that the virus name merely refers to just makes the fundamental essentialism more dynamic and elusive, but maintains the scapegoat logic of traditional categorical thinking.  A disease is much more effectively analyzed as an attractor in phase space.  It isn’t caused by a pathogen, because pathogens are harmless in different environmental and psychological contexts.  What I write about should be conceived of as ‘meta-attractors’ of ‘meta-super-systems’ in ‘state/control meta-space’, undaunted by division-by-zero singularities.
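The “attractor in phase space” language here is standard dynamical-systems vocabulary.  As a minimal, hypothetical illustration (the logistic map, chosen only to show the concept, not a model of any disease process), trajectories from very different initial conditions can settle onto one and the same attractor:

```python
# Minimal sketch of an attractor in phase space, using the logistic map
# x -> r * x * (1 - x).  For r = 2.5 the map has a stable fixed point at
# x* = 1 - 1/r = 0.6, so trajectories from different starting states
# converge to the same attractor.  Purely illustrative: not a disease model.

def logistic_step(x, r=2.5):
    """One step of the logistic map."""
    return r * x * (1.0 - x)

def settle(x0, r=2.5, n=200):
    """Iterate the map n times from initial state x0 and return the result."""
    x = x0
    for _ in range(n):
        x = logistic_step(x, r)
    return x

# Two very different initial conditions end up on the same attractor.
a = settle(0.1)
b = settle(0.9)
```

Changing the parameter r reshapes the attractor (periodic orbits appear, then chaos as r approaches 4), which is one way to read “environmental context” in dynamical terms.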

What makes something what it is depends on everything else related to it,  PRECISELY!!!   

therefore it is not useful and often harmful to think in essentialist ways.  Einstein’s General Theory of Relativity embodies precisely this ‘holonomic/holographic’ [cf. Bohm] principle, as I elaborated in another recent e-mail message to you.  It’s too bad that you dismiss Einstein’s work, and, moreover, for non-scientific reasons!

 
I don’t dismiss anyone’s work.  He was a brilliant man.  I just think some other ideas clear up and illuminate what he was trying to understand.  His transformation into dogma spurs some rather necessary countermeasures, however.  Moreover, you don’t know enough of why I disagree with him to claim I am being “unscientific” with my opinion.  We have barely scratched the surface with our talk of his work, which has many aspects beyond general relativity, as you know.  I send you links when you seem interested because these things demand a nuanced discussion that would take me days and merely repeat what people like Larson have already written and which is available in complete form on the internet.  But you haven’t given me a sign you have taken seriously any of these suggestions or analyses.  Pronouncing me unscientific without looking at the arguments is unscientific.  I may sound dismissive sometimes about relativity, but I seldom have time for a more nuanced discussion standing in Sundance, and with these letters already very involved I am not going to explain Larson’s excellent gravitational theory in a single letter, unless you really want to look at his work and discuss certain points with me.  Einstein was on the right track in some ways but got lost in the math instead of making the jump from the equivalence principle and Mach’s principle to the beautiful idea of scalar motion, which is a much more logical induction from the data.
 
  I completely understand your skepticism about alternative theories;  I feel the same way about any theory.  I am only interested in reality, not speculation.  If a theory isn’t based on clear principles that can be validated in my experience, I find it is better to find clearer principles to tease out what was being attempted. There is truth in everything if you can find its thread.  Often the truth is very different than its appearance, but it is there when you have enough context.
 
 Physics will find its way back to the truth with its increasing interest in gauge and symmetry theories that are digging behind the spatial field theories into the underlying abstract motions in symmetry space.
 
 
This has been one of the major themes of the last half century in theory.  That fact may be a sign of the experiential/experimental correctness of this theme, or it may not.  Just consider the thematic trends typical of the ‘human phenome’ of the Dark/Middle Ages in scholastic, Aristotelian scholarship.  Popularity does not necessarily guarantee scientific validity.
Nothing guarantees validity.  The fact that what passes for hard science has kept reductive models and distorted reality into meaningless niches doesn’t indeed mean they are wrong, but in this case the trend in Theory, where people are actually thinking through the consequences of ideas and finding ways of modelling that liberate, connect, and open a wondrous space for new ideas and exploration, suggests it is a promising path.  All you are doing is siding with the mainstream trend.  We are all working with the same data.  Even within the mainstream there are always many alternative ways of modelling, especially in complex fields.  It isn’t just fashion that decides between competing theories that both fit the data.  Numerous forces are at work in every field.  I am merely advocating for an alternative thread that has penetrated the mainstream more in fields where cooperation and interdisciplinary knowledge can guide us.  This was Feyerabend’s suggestion as well: not to collapse field boundaries, but to let the connections between fields guide our evaluation of theory.  Science studies as a field is an important bulwark against the ignorance many seem to display when it comes to the process of peer review and scientific consensus.
 
 

If it is a scientific theme, it must prove itself in both the laboratory, ‘in vitro’, context, and also in the ‘in vivo’, ‘open Nature’ context.  Maxwell’s Electromagnetic Field Theory and Einstein’s Gravitational Field Theory have done so — in spades! — for the aspects of reality which they address.  To my knowledge, Larson’s “scalar motion” theory has not.
You obviously have not looked at Larson’s theory of scalar motion.  It is a different way of thinking of the evidence.  It could have replaced general relativity if it had been done in the fashionable way.  Instead physics will probably work its way back to ideas like his and Kozyrev’s either through leaks from the classified physics or through eventual convergence in biophysics and other interdisciplinary work where the gauge symmetry is known to be altered or tuned as it is with life and consciousness.
 

Instead of essences, Deleuze uses the word “multiplicity” from mathematics to refer to entities’ essential structure in the space of possibilities.  An <<arithmos>>, i.e., a kind-of-being category, is precisely a multiplicity that is treated, in part because of the limitations of human minds’ capabilities to track multitudes of monads, as a unit or a unity.

The key point here is that everything is a unique individual, especially more complex individuals like humans or a society; we are all singular and unique.  That isn’t to say devoid of structure, but everything derives its structure in ways that are part of other entities, not just in some diffuse external way based on external qualities or placement in extension space; rather, placement in space and time are ways of representing that differential structure itself, so our categories tend to obfuscate the inner space of possibilities that define an individual by thinking about it as a member of a class of entities bound by their sameness.  Difference is older than Being, as Derrida was fond of saying.  Or Being is that differential continuum, which is what Heidegger seemed to have meant.  Deleuze says multiplicity = univocity.

This structure is not clear and distinct like traditional essence because it depends on context and relations that can never be exhausted by reference systems, but I don’t think this precludes using some version of categories and sets as you do.  I agree.

It just acknowledges there is no transcendent space of identity or presence since all categories/ideas/ multiplicities and their actualized assemblages are in a dependent relationship with each other.  Agreed.

Time is not some agent of change on a static background of forms building novelty in law-like regularity.
The prevailing concept of, reified, “Time” to which you refer here is a classic if usually unrecognized case of the ideological inversion of real subjects/agents and real objects of which Marx warned.  I view time as the resultant, not the cause, of change — as the ensemble, the chorus, the symphony, of the self-activity and other-activity of all monads; as the ‘<<rhythmos>>’ of the dialectic.  Dialectic is the cause of time.  ‘Self-duality’ is the cause of dialectic.

 
sounds about right :)
 
 

There are no independent entities like space, time, electron, force, energy, etc., or us for that matter; we all derive from everything and in turn create everything else.  There is no privileged position in the topological continuum.  I believe that this approach is one-sided, and impractical, and risks the error of which Hegel warned, under the name of ‘that dark, dark night in which all cows are black’.  As I see it, the totality is real, but also its self-induced self-differentiations are real, though never absolutely separate, or radically dual or plural.  The cosmos, I believe, is more like a developing, self-differentiating zygotic embryo than like either a gas of Democritean atoms, or a Parmenidean/Advaitist undifferentiated, homogeneous, eternally static blob of “PURE BEING”.

I should have said this better.  It is not the totality to which I am referring.  I don’t think totality is real at all.  Unity is in everything, but there is no collection of entities that exist in or as part of some transcendental whole.  Infinity is merely a reference horizon necessary to build reference frames as you descend down through the projective stratum of geometry (the line at infinity and its geometric dual, the point at zero).
 
I am not collapsing the dialectical tension of one and many back to the one, I am saying we are each that one.  We are not parts of a single whole or a singular substance, we are one being unfolding the infinite possibilities implied within us.
Breaking that foundational and very immanent symmetry creates position and the affine geometry of scalar expansion/contraction.  Dark matter and energy need not be abstract unknowns, they are obvious ordinal consequences of the topological continuum as you break its symmetry with scalar assumptions about reference points.  I think that you’ll have to unpack this for me for it to have any chance of becoming persuasive to me.  I am inclined to belief that Asymmetry, not symmetry, is the key to the creative crises of the <<kosmos>>.
The topological continuum is very limited by its extreme symmetry, so yes, the introduction of projective, affine, metric, and Euclidean strata introduces increasing breaks in symmetry and makes possible the richness of form we see here.  But even at the most primordial/complex layers of Being, there is always difference, asymmetry as you say, which unfolds as Being, as difference, as those ordinal series become embodied as motion in space, of which the affine layer, which begins the assumption of in/out in what Larson calls “scalar motion”, is a natural consequence.
 
The basic scalar motion of what he calls the “outward expansion of the natural reference frame”, which happens at the speed of light, is only viewed as a motion from the stationary reference frame in the space we inhabit of low-speed motion, because we experience the inward scalar motion of gravity, which, like all forces, is a product of motion, not the cause.  At the level of unity, light speed, the ratio is 1/1, so it isn’t really motion except in combination with other motions that are inherent within the possibilities of a unity, which is not a static blob of symmetry but a ratio of unity that can create worlds of realization within the reference frames wedged between zero and 1, and 1 and infinity.
 

I think any immanent understanding of physics has to eschew the assumption of a transcendent spatial dimension acting as container for the others.  So does Lee Smolin.  So do I.

Larson’s physics anticipated what has emerged in the contemporary complexity paradigm in its role as a general theory.  Can you demonstrate this?

I hope I have given you a taste for why I think this.  We can discuss more later. 
The more I understand of your thinking, the more it interests me as a development of aspects of modernist thought as it tried to incorporate change and difference but did not fundamentally question representational logic–or, in the sciences, with the struggles of traditional logic-based modelling of intelligence that eventually led to the very post-modern approach of connectionism and its progeny in contemporary AI research.  I see my work as an immanent critique of ‘The Modern Ideology’ — the Capital-mentalité, the ‘human phenome’ of the capitalist epoch — entire.  It is thus both an <<aufheben>> negation and an <<aufheben>> extension of that which it immanently critiques.  Consider Marx’s critique of the ideology-vitiated science of capitalist “classical political economy”.  If you and/or your sources have “fundamentally questioned” what you mean by “representational logic”, what has been the practical fruition of this questioning?  What do you mean by “post-modern”?  Most of what I hear characterized as “post-modern” are pro-decadence, pro-contracted-social-reproduction ideologies, engineered, funded, and spun by the Rockefeller plutocracy to assist in its effort to reverse the historic growth of the social self-force of social self-reproduction, that growth being the key to human progress and to human liberation, but being therefore also a social self-force that, if it continues to grow, will eventually overthrow capitalist ruling-class rule, much to the chagrin of the Rockefeller plutocracy.
Post-modern to me means complex… defying representation in a single metanarrative.  Postmodern theory didn’t create the decadence or complexity of our postmodern society; it has just responded to it.  At its best, in the complexity paradigm, truth, metaphysics, ontology, metanarratives (whatever you want to call them or it) aren’t abandoned as much as they are used with awareness of the dangers and contingencies.  The practical fruition is the evolving maturity of our ideas, which are becoming more adapted to dealing with difference and diversity, to truly appreciating each individual without homogenizing all their complexity, without undermining them down to their parts or overmining them up to some generic category.
 
In any case, essence and identity are displaced by systems of differences, not– as you seem to be saying– in a merely “present” but changing ontological whole, but in a truly immanent universe in the sense that no entity is ever transcendentally outside the text of the universe, being merely “past” or present and available for neutral representation.  I see the ontology of all pasts as being <<aufheben>>-contained inside every new-born present, and I see every present ‘cumulum’, as represented by a ‘qualitative, non-amalgamative sum’ of ontological categories, as, precisely, a ‘system of differences’, or even as a ‘system of conserved oppositions’.  But I also see past, present, and future as intermutually qualitatively different, and real, and I reject all of the mysticisms of “eternal timelessness” as being contra-experiential.
Agreed.  I would just add that again it is the possibility space that determines the structure so it isn’t just a tension between conserved oppositions stored from the past but between possibilities, not just of the future, but of what history is actualized as well.  There is no one past, since it is constructed in the present by beings along with their future as they navigate the space of possibility.
I do not deny the reality of the electron for instance, but its reality is quite a diffuse system of differences lacking much coherence or use at this point.  I would have thought that the vast reliability and practical success of electronic circuitry, ubiquitous in our time and clime, and all based upon the theory of the electron, would have demonstrated otherwise to your satisfaction! 
You are just as skeptical of quantum theory as I am if not more, even though you know it works well.  Why keep insisting that functionality implies impeccability? Quantum theory uses mathematics that work and can model what happens in atomic systems well enough, but the nuclear model of the atom was scarcely around very long before it became just a vague metaphor, before the reality of the electron became a probability cloud.  Not that probability space isn’t real, but conflating the probability of interaction of certain motions with some particle orbiting the nucleus is wrong.  They still teach the visual/conceptual model to high school kids but it hasn’t been more than a metaphor for a century. 
 

I prefer other systems of ideas, like I have mentioned to you, that make what we call an electron much more coherent and understandable than the belabored attempts to maintain a particle concept dealing with a phenomenon that should have thrown this whole paradigm of being as presence out of the game a century ago.  I agree that “point-particle” models are deficient — as do many physicists today — but I do not understand your objection to “being as presence”, unless it is due to a belief that all pasts are simultaneously “present” in some timeless, transcendental, metaphysical — mystical — way.

No.  It is a rigorously developed concept that practically defines the last century of Theory.  I can’t do it justice in a few words, but it runs through the themes of these emails as it is.  Basically, a thing is made out of difference; it is not self-same.  You might say in your terms that it has self-duality, but that dialectical tension is not between any two things that are ever completely present either.  For Derrida, Being is Différance, with an “a”, which he coined because it isn’t just another spatial concept of difference but also temporal; things don’t just differ from themselves, they defer themselves.  Everything is within a context, and we cannot take any context for granted as being the right one.  We always have to justify our selection.  In the context of this discussion, drilling down to the bedrock of reality only gets us smaller and smaller mirrors of our chosen framework.  It won’t show us the larger context in which these things are appearing, just more debris from our assumption that things are made out of stuff.  Larson’s assumption is that everything comes into being through motions, not of any thing, but through the absolute magnitudes of differential relations that can be fed through reference frames that change motion from an abstract cross-ratio, tied to the unity of lightspeed as a natural reference frame, into vector motion and patterns of motion that create the objects we see.
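For what it’s worth, the cross-ratio invoked here is a well-defined projective invariant independent of Larson’s framework: it is preserved by any projective (Möbius) transformation of the line.  A small self-contained check, using an arbitrarily chosen map (the specific coefficients are illustrative, not from anything Larson wrote):

```python
from fractions import Fraction

def cross_ratio(a, b, c, d):
    """Cross-ratio (a, b; c, d) = ((a - c)(b - d)) / ((a - d)(b - c))."""
    return ((a - c) * (b - d)) / ((a - d) * (b - c))

def mobius(x):
    """An arbitrary projective (Mobius) map, x -> (2x + 1) / (x + 3)."""
    return (2 * x + 1) / (x + 3)

# Exact rational arithmetic avoids floating-point noise.
pts = [Fraction(0), Fraction(1), Fraction(2), Fraction(5)]
before = cross_ratio(*pts)                        # cross-ratio of the points
after = cross_ratio(*[mobius(p) for p in pts])    # cross-ratio of their images
```

Equality of `before` and `after` is exact here because `Fraction` arithmetic is used; with floats one would compare within a tolerance.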
Preference for systems of ideas is a philosophical inclination, or a matter of personal taste. 
Everyone has their reasons for why they think the way they do.  Some people are more aware of those reasons.  Some people have better reasons.  You seem to think you are above the personal, but we are all on the same ground.  I have my experience, and I listen to as much of other people’s experience as I can, and try to understand the world that would produce all the experiences I have and am exposed to.  My preferences reflect the ideas that explain more of the world in a better way.
Scientists, as such, are duty-bound — however much this duty may be honored more in the breach — to uphold the “systems of ideas” that have — almost certainly only temporarily — won the consensus of the scientific community, supposedly because of their greater descriptive/reconstructive/predictive efficacy, until or unless they can produce a theory/system of tested ideas that demonstrates greater descriptive/reconstructive/predictive efficacy than the [until-then] prevailing system of tested ideas.
By what tests could your preferred views be corroborated, or, even more importantly, falsified?
My preferred views are supported by more evidence.  The mainstream ideas are falsified all the time.  They just push the data till it fits.  They don’t deserve your allegiance.  I understand not wanting to accept something without evidence, but a huge amount of mainstream physics is unsettled.  Even the oldest layers of the theory get altered by new discoveries and ideas, so nothing is ever really settled.  There always can be better ways to model something.  There is no duty to uphold anything.  The true scientist should be exploring, not defending dogma.
