NOTES ON: DeLanda, M. (2002) Intensive Science and Virtual Philosophy, London: Continuum

[This is Deleuze for scientists, and also for those readers who want to understand some of the mathematical and scientific background scattered throughout Difference and Repetition—explained very well in DeLanda's videos on Deleuze. It has also been hailed as a philosophical account of complexity theory. It is a challenging read for sociologists, but I much prefer it to the literary and subjective interpretations, especially those found in educational commentaries. Underlinings are mine, designed to help me look through quickly]
This is to be a ‘reconstruction of [Deleuze's] philosophy, using entirely different theoretical resources and lines of argument’ (4). This is because the general issue of changing theoretical assumptions and developing a realist ontology needs to be addressed—and the arguments are often found in a very condensed way in Deleuze. It is necessary to discuss not ‘Deleuze’s words [but]…Deleuze’s world’ (5). It is not a comprehensive reconstruction. The main theme will be to replace essentialism with a notion of ‘dynamical processes’ of various kinds. Deleuze himself often offers only a ‘compressed’ account of these issues, one which ‘assumes so much on the part of the reader, that it is bound to be misinterpreted’ (5).
Deleuze conceives of difference as a positive productive force which drives a process, especially intensive differences such as differences ‘in temperature, pressure, speed, chemical concentration’ (6). Epistemological/ontological issues ensue, but the most important thing for Deleuze is to correctly pose relevant problems, especially to focus on ‘the singular and the ordinary’ (7).
Although this reconstruction must be adequate, ‘There is a certain violence which Deleuze’s texts must endure in order to be reconstructed for an audience they were not intended for’ (8). In particular, it is necessary to avoid premature solidification of what is intended to be a fluid and open text.
Chapter one. The mathematics of the virtual: manifolds, vector fields and transformation groups
[There is an excellent summary of the themes of this chapter in the beginning of chapter 2: Realist ontology describes ‘a relatively undifferentiated and continuous topological space undergoing discontinuous transitions and progressively acquiring detail until it condenses into the measurable and divisible metric space which we inhabit’ (56).]
The key concept is multiplicity, but this has highly technical definitions based on different sorts of mathematics. It is best seen as a radical replacement for the concept of an essence. One example of an essence would be to suggest that what makes humans alike is that they are rational. Deleuze wants to replace the notion of essence with a ‘morphogenetic process’ (10). Species are historically constituted rather than representing timeless categories. All ‘transcendent factors’ are replaced by ‘form-generating resources which are immanent to the material world’ (10). It is necessary to avoid essentialism at a deeper level too, the level which suggests similarities of process. ‘Multiplicity’ attempts to do this, by specifying ‘the structure of spaces of possibilities’ (10).
Multiplicity is a concept related to 'manifold', a way of describing geometrical spaces with certain properties. Understanding of manifolds has developed in geometry, beginning with Cartesian space, a way of locating curves in a two-dimensional space. The location of each point could be expressed as a relation between numbers [X and Y values]. Algebra could then be used to describe processes and shapes.
Differential geometry (Gauss and Riemann) developed from the use of calculus. Originally used to calculate rates of change in quantities relative to each other, this led to the idea that geometrical objects, such as curves, could also be described at each point. The surface can be studied in itself, ‘without any reference to a global embedding space’ (12). The surface was seen as a space in itself. Riemann went on to develop a geometry of N-dimensional space, and the structures in this space ‘were originally referred to by the term “manifold”’ (12). This led to completely new ways of understanding space, which were to be taken up by Einstein and others. Deleuzian multiplicities are like manifolds with a variable number of dimensions and with no external unity imposing coordination, quite unlike essences.
It is still necessary to explain how multiplicities relate to physical processes. Here we need some theory of dynamic systems. The ‘dimensions of the manifold are used to represent properties of a particular physical process’ (13), as a kind of model. One way to model an actual object is to consider the number of ways in which it can change—its ‘degrees of freedom’ (13). These changes can then be related to each other using differential calculus [somehow]. If an object like a pendulum can change only position and momentum, it has 2 degrees of freedom, whereas a bicycle has 10—five parts such as the handlebars, wheels, crank and pedals, each with the possibility of changing both position and momentum. Each degree of freedom becomes one dimension of a manifold, so that the pendulum needs a two-dimensional plane, and the bicycle a 10-dimensional one. The space of all the concrete possibilities is called a state space, and changes of state can also be described accurately. This model is a way of capturing processes, so that we can map all the possibilities (although there is a loss of information, since each state of the object becomes a single point on a trajectory).
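[A toy sketch in Python to fix ideas (mine, not DeLanda's; all names and numbers are my own): the pendulum's two degrees of freedom, angle and momentum, give a two-dimensional state space, and each successive state of the system is a single point on a trajectory through that space.]

```python
# Toy sketch (mine, not DeLanda's): an ideal pendulum has two degrees of
# freedom (angle and angular momentum), so its state space is a plane;
# the evolving system traces a trajectory of points through that plane.
import math

def pendulum_trajectory(theta0, omega0, g_over_l=9.8, dt=0.001, steps=5000):
    """Integrate d(theta)/dt = omega, d(omega)/dt = -(g/l)*sin(theta)."""
    theta, omega = theta0, omega0
    points = []
    for _ in range(steps):
        points.append((theta, omega))  # one point in the 2-D state space
        omega -= g_over_l * math.sin(theta) * dt
        theta += omega * dt
    return points

# Each initial condition yields one trajectory; mapping many of them
# gives the 'phase portrait' discussed later in the chapter.
trajectory = pendulum_trajectory(theta0=0.5, omega0=0.0)
print(trajectory[0], trajectory[-1])
```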
We can now use topological maths to analyse some other features of these spaces, especially those which describe ‘recurrent or typical behaviour common to many different models’ (14). These special features are singularities, and they influence trajectories and thus physical systems. They can become attractors for trajectories, for example, influencing any within their ‘basin of attraction’ (14), and producing the long-term tendencies of the system. Some singularities are points, whereas others take the form of closed loops or oscillations. The example given [described as an abstract machine in the book on war] refers to the classic shapes taken by different physical objects as they attempt to minimize bonding energy. The minimal bonding energy is the attractor, but the objects will actually vary according to their own properties—taking a spherical form for soap bubbles, and a cubic form for salt crystals. Such abstract machines are mechanism independent, and thus ‘perfect candidates to replace essences’ (15).
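[A hedged sketch of the point-attractor idea (my toy example, not from the book): add friction to the pendulum above and the rest state acts as an attractor, with every trajectory in its basin spiralling towards the same long-term tendency whatever its starting point.]

```python
# Toy sketch (mine): with damping added, the rest state (theta=0, omega=0)
# acts as a point attractor; trajectories starting anywhere in its basin
# of attraction approach it ever more closely.
import math

def damped_pendulum(theta0, omega0, damping=0.5, g_over_l=9.8,
                    dt=0.001, steps=20000):
    theta, omega = theta0, omega0
    for _ in range(steps):
        omega += (-g_over_l * math.sin(theta) - damping * omega) * dt
        theta += omega * dt
    return theta, omega

# Very different initial conditions end up near the same attractor:
for theta0, omega0 in [(0.5, 0.0), (-1.0, 2.0), (2.0, -1.0)]:
    print(damped_pendulum(theta0, omega0))  # all close to (0.0, 0.0)
```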
Yet multiplicities have more dynamism, and this helps avoid an essentialist understanding of process. Multiplicities are by no means fixed structures, but unfolding ones, ‘following recurrent sequences’ (16). The example here is the fertilised egg which, as it turns into an embryo, is not following a preformed path, but differentiating progressively as it develops—there is no ‘clear and distinct blueprint of the final organism’ (16).
There is a more precise mathematical way to describe progressive differentiation—the theory of groups. Elements of groups combine in rule-governed ways, such as regular transformations—you can rotate an object by steps of 90°. To do this with a cube would produce a certain invariance in appearance, but not if we rotated it by steps of 45°. A sphere, however, would remain invariant either way—it is described as having ‘more symmetry than the cube relative to the rotation transformation’ (17). Classifying objects by their degrees of symmetry offers a way to classify them in terms of process, and of responses to events: it is a relative property, not an intrinsic one. It is also possible to change one object into another by altering its symmetry. The sphere can become a cube by losing symmetry, or undergoing a ‘symmetry-breaking transition’ (18).
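[A small computational sketch of the symmetry point (my example; DeLanda's are the cube and sphere): rotations through multiples of 90° map a square's vertices onto themselves, 45° rotations do not, and a circle would survive every rotation, so it has 'more symmetry' relative to this transformation.]

```python
# Sketch (mine): test which rotations leave a square's vertex set unchanged.
# The square survives 90-degree steps but not 45-degree ones; a circle
# would survive them all, i.e. it has more symmetry relative to rotation.
import math

def rotate(point, degrees):
    a = math.radians(degrees)
    x, y = point
    return (round(x * math.cos(a) - y * math.sin(a), 9),
            round(x * math.sin(a) + y * math.cos(a), 9))

square = {(1, 0), (0, 1), (-1, 0), (0, -1)}

for angle in (45, 90, 135, 180):
    image = {rotate(p, angle) for p in square}
    print(angle, "invariant" if image == square else "changed")
# 90 and 180 print 'invariant'; 45 and 135 print 'changed'.
```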
Symmetry breaking can be seen as a way to explain phase transitions. In some physical systems, these occur when some parameter changes—water turns to ice or steam at particular temperature points. Gas has more symmetry than solids. In the example of the fertilised egg, development occurs as a result of a ‘complex cascade of symmetry-breaking phase transitions’ (18).
When a singularity undergoes such a transition, it can be converted into another one, as a bifurcation. Again a particular critical value of a parameter is required, a ‘threshold of intensity’ (18). Bifurcations may take place in a regular sequence, as when point attractors become loops. The sequence in flows of liquid offers a familiar example, as they go from steady-state to cyclic and then turbulent flows, or as heat produces conduction, then convection currents, then turbulence. Apparently, when studied, these observable transitions take a more complex form.
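[A hedged illustration of such a regular sequence of bifurcations (my stand-in, not the book's fluid example): the logistic map is the standard toy system in which raising a single parameter converts a point attractor into cycles of period 2, then 4, and eventually turbulent-looking chaos.]

```python
# Toy sketch (mine, not from the book): in the logistic map x -> r*x*(1-x),
# raising the parameter r past critical thresholds converts a point
# attractor into periodic loops of period 2, 4, ... : a bifurcation cascade.
def long_term_states(r, x=0.5, transient=2000, sample=16):
    for _ in range(transient):  # let the trajectory settle onto its attractor
        x = r * x * (1 - x)
    states = set()
    for _ in range(sample):     # then sample the attractor itself
        x = r * x * (1 - x)
        states.add(round(x, 4))
    return sorted(states)

for r in (2.8, 3.2, 3.5, 3.9):
    print(r, long_term_states(r))
# 2.8 -> one value (point attractor); 3.2 -> two; 3.5 -> four; 3.9 -> many.
```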
There are differences between these physical processes and mathematical understandings, since the latter are ‘mechanism independent’ (19). Physical systems have much more specific sequences of events, although there are some similarities too. Mechanism independence is what replaces essences in Deleuze—‘multiplicities are concrete universals’ (21). These mechanisms can result in quite different concrete outcomes, as with the examples of the soap bubble and the salt cube above, since the similarities lie at the level of process, not product; hence the ‘obscure yet distinct nature of multiplicities’ (21). Further, multiplicities can be seen as meshed together, ‘creating zones of indiscernibility…forming a continuous immanent space very different from a reservoir of eternal archetypes’ (21).
In this way, multiplicities differentiate (unfold through broken symmetries) and specify a series of ‘discontinuous spatial structures’ (‘differenciation’ [with a 'c'], and nothing to do with Derrida at all). One product of differenciation is the normal three-dimensional space (22). We need to understand what space actually is—the set of points grouped into neighbourhoods, which vary in terms of their proximity or contiguity with each other. Normally, we think of metric ways to measure proximity, but in other spaces, such as topological ones, these do not remain fixed. This requires non-metric definitions of properties like distance. However, metric spaces can be seen as arising from a progressive differentiation of non-metric ones, as a symmetry-breaking cascade.
This was realised through developments in geometry, especially non-Euclidean ones. The example of a manifold has been discussed, but there are other forms of geometry [such as affine geometry, and others, ending in topology]. Euclidean geometry can be derived from these later forms, since they are ‘related to each other by relations of broken symmetry’ (23). [Different forms of transformations are responsible, and metrics do not remain invariant under these. Examples include cases where some properties such as ‘the straightness of lines remain invariant, but not their lengths’ (23). Projective geometry is even more mysterious, with transformations called projectivities]. Each form offers more symmetry than the level below—‘as we ascend from Euclidean geometry more and more figures become equivalent to one another, forming a lesser number of distinct classes. Thus, while in Euclidean geometry two triangles are equivalent only if their sides have the same length, in affine geometry all triangles are the same (regardless of lengths)’ (23). At the highest level, topology, geometric figures remain invariant under the widest range of transformations, since bending, stretching and deforming are all permitted [the demonstration on the YouTube video is particularly helpful here, showing how a doughnut might be bent and pinched until it becomes a cup]. This is the least differentiated geometry.
In this way, symmetry-breaking cascades produce more differentiated geometric spaces, or more structure [as you go down the hierarchy]. This could be seen as a metaphor for ‘the birth of real space’ (24). In mathematics, these are purely logical relations, but there is an ontological dimension as well. This is where we need a distinction between intensive and extensive physical properties. The latter include metric properties and quantities, which are intrinsically divisible, while intensive properties cannot be divided [we can divide volumes of water, but the temperature of the water does not split into halves]. When you do change intensive properties you change qualitatively, or in kind, as with rising temperature inducing phase transitions in water.
Objects therefore arise when an
intensive space differentiates itself to produce
extensive structures [there is an example with
quantum physics, 25—the four fundamental forces
arose from a series of phase transitions].
Essences assume that physical objects somehow receive external forms, but multiplicities explain how patterns can be developed ‘without external intervention’ (26). The mathematical models can be seen as describing real physical processes producing metric space. There is no need to see change as only arising in the social constructivist sense [discussed further in DeLanda here]. More work is still needed to replace even the mathematical metaphors and analogies [next chapters].
Deleuze uses multiplicities to achieve some other goals as well, especially in displacing ‘modal logic…the branch of philosophy which deals with relations between the possible and the actual’ (27) [more in Logic of Sense]. If this process involves speculation, Deleuze proposes certain constraints to guide such speculation—avoiding essentialism, for example.
Deleuze analyses state space in an original way, specifically analysing the relation between the features of state space and the trajectories which are determined by them, and arguing that there is an ontological difference involved [i.e. we can explain reality that way]. It is possible to use differential calculus to calculate a specific value for a rate of change ‘such as instantaneous velocity (also known as a velocity vector)’, while integral calculus takes these instantaneous values and ‘reconstructs a full trajectory or series of states’ (28). Together, the processes ‘generate the structure of state space’. The process begins with experimental observations of empirical changes, actual series of states, and trajectories are then created from them. These in turn are used to predict actual velocity vectors, constructing a ‘velocity vector field’. We then use integral calculus to move beyond empirical observations to generate still further trajectories as predictions. Together, these trajectories produce ‘“the phase portrait” of the state space’ (28). For Deleuze, there is an ontological distinction between these empirical trajectories and the possible ones [the former have to be actualised?], or between the vector field and the phase portrait. It is in the vector field that singularities are distributed in a known manner.
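[A minimal sketch of the two-step procedure as I read it (illustrative only; function names are mine): differentiation gives a velocity vector at each point of state space, and integration strings those instantaneous vectors back into predicted trajectories, whose ensemble is the phase portrait.]

```python
# Sketch (my gloss on the passage): a vector field assigns an instantaneous
# rate of change to every point of state space; integrating it from chosen
# starting points generates trajectories, and many such trajectories
# together make up the 'phase portrait'.
import math

def vector_field(theta, omega, g_over_l=9.8):
    """Velocity vector at the state-space point (theta, omega)."""
    return omega, -g_over_l * math.sin(theta)

def integrate(theta, omega, dt=0.001, steps=3000):
    """Reconstruct a trajectory by following the instantaneous vectors."""
    path = []
    for _ in range(steps):
        dtheta, domega = vector_field(theta, omega)
        theta, omega = theta + dtheta * dt, omega + domega * dt
        path.append((theta, omega))
    return path

# A phase portrait: trajectories integrated from several initial states.
portrait = [integrate(t0, 0.0) for t0 in (0.2, 0.8, 1.5)]
print([path[-1] for path in portrait])
```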
In state space, trajectories always approach attractors infinitely closely, but without reaching them. Thus we never achieve a state of actualisation [it needs additional features?]. Attractors therefore represent the long-term tendencies of the system, and not its actual states. Nevertheless, attractors are real and have real effects, especially in stabilising trajectories. Attractors also stabilize entire vector fields [by being distributed among them in a known way—this is apparently tested by adding a small vector field and seeing what happens to the distribution of attractors]. Structural stability is typical, but serious disturbances will lead to structural change such as bifurcation.
It is now possible to offer a ‘final definition of a multiplicity. A multiplicity is a nested set of vector fields related to each other by symmetry-breaking bifurcations, together with the distribution of attractors which define each of its embedded levels’ (30). In this sense it offers both actual states and possible ones which are never actualised. A multiplicity is not ‘realised’ when it produces concrete events, because multiplicities are real already—they actualise. Deleuze then has to introduce the term virtuality to describe the status of multiplicities. Virtualities are also real, since they have real effects in the objective world and must be considered as part of objects.
We also need to deal with the notion of modality, the ‘ontological discussion of possibilities’ (31). Classic modal philosophy runs into problems because it focuses excessively on sentences expressing what might have been, but this is entirely speculative, lacking any sort of structure and thus offering ambiguity. Perhaps analyses of state space can overcome these limitations, because they provide a knowledge of actual trajectories in state space and how they are individuated, and they are able to limit possibilities. Thus the emergence of individual actualizations ‘is defined by laws…as well as by initial conditions’ (32), such that only one individuated trajectory becomes possible. Typically, though, there will be many individual trajectories, ‘one for each possible initial condition’, although particular combinations may be ruled out (32). There is still a philosophical debate about the status of mathematical possibilities like these, and empirical adequacy might be a test—those producing actual sequences are held to be real. However, it might not be possible to isolate individual trajectories from the global system, which might contain information about individual trajectories [I think, 33].
For Deleuze, it is important to focus on the realist possibilities, but he does think it necessary to apply some deeper understanding about how trajectories become individuated, which means examining the regularity generated by singularities in the whole state space [I think—individuation is a first step in actualisation, although only some individuations get actualised?]. Nevertheless, it is the actual vector field which produces actualizations, and so it cannot just be seen as a mathematical possibility.
Philosophers like to find examples in classical physics, but these tend to select simple vector fields relating to linear systems, where there is only one possible attractor. Here, influences on the chosen vector field are minimal. However, in more typical complex examples, there is a distribution of singularities, several attractors of different types, and their 'basins of attraction'. Is it necessary to describe the possibilities in terms of virtualities? [Remembering that this is held to be a real potential, not just a linguistic possibility]. We could define state space as a set of possible points instead. However, the point is to explain how actualizations take place [remembering that trajectories never actualize because they never converge on attractors]—something else is needed beyond mere possibility.
Perhaps the old category of 'necessity' can be applied? This category belongs to classical physics, with classic determinism with clear initial conditions and general laws affecting a trajectory at each point. The new picture adds singularities in a more complex way. Now, as many initial conditions can end in the same end state, the states that trajectories go through are less relevant and indeed may fluctuate considerably. Similarly, the attractor has a stronger role compared to the initial conditions (35). It is important not to take the simplest example of single attractors whose basins take up the entire space [the classic linear deterministic example]. Only then does determinism specify a single outcome. More complex space, with multiple attractors, ‘breaks the link between necessity and determinism, giving the system a “choice” between different destinies, and making the particular end state a system occupies a combination of determinism and chance’ (35). There can be contingencies, accidental disturbances and shocks, altering the power of individual attractors. Bifurcations can alter the distribution of attractors. [This is where Prigogine and Nicolis are cited—apparently, ‘chance fluctuations in the environment’ decide which fork of the bifurcation appears, 35-36].
This is not Deleuze’s actual argument, but ‘it follows directly from his ontological analysis’ (36). His discussion is more general and philosophical. He has already argued that we must not see virtual multiplicities as essences, and therefore that much modal logic has to be rethought, because specifying possibilities nearly always involves an essence—and this applies ‘also to those physicists who seriously believe in the existence of alternative parallel universes’ (36). For example, possible universes still assume the existence of fully formed individuals with the same underlying identity.
For Deleuze, individuals have to be explained, and not taken for granted, by referring to the process of individuation. There is thus a connection between the boundaries of individuals and the ‘objective production of the spatio-temporal structure’ (37). The conventional notion of possibilities does not explain this process, and thus runs into difficulties about whether the possibles share the same essence or not. Deleuze thinks that he has explained detailed differences in reality instead.
This also helps him avoid typological thinking [like avoiding essences, another proscription guiding his speculations]. Some typologies involve essences, but not all of them [Aristotle’s natural states]. Botanical taxonomy is the best example. Similarities and differences were tabulated, and higher-order relations of analogy and opposition used to generate new categories. This was to be a timeless classification. For Deleuze, however, these apparently natural relationships, like resemblance, should be seen as ‘the mere results of deeper physical processes, and not as fundamental categories on which to base an ontology’ (39). It is necessary to explain judgments, but not by referring to subjective categories or conventions [as in social constructivism], ‘but [as] a story about the world...the objective individuation processes’ (39).
More follows in the next chapter about species and individuals. Basically, species sort themselves by natural selection and then consolidate through reproductive isolation. Together, these factors affect the identity of a species and how clear-cut it is, or whether it can be hybridised. Again, the degree of resemblance is a matter of actual historical details and developments, not a matter of fundamental concepts. This argument can be extended to all natural events, including the ways in which human beings have become historically constituted. This even includes apparently natural elements like metals—the example is gold, whose properties actually vary according to the scale of the sample [individual atoms of gold do not melt, and there are various intermediary structures such as crystals and grains which also have different properties, and these change at particular critical sizes (40)].
The negative constraints, avoiding essences and typologies, are accompanied by positive resources, as in the following chapter. Generally, the task is to follow the traces of the virtual in individual events or entities, and reconstruction can show how multiplicities can 'form a virtual continuum’. We also have to develop the notion of virtual space and virtual time, and connect with the empirical laws of physics. Virtuality will be seen to be a useful construction, replacing laws and essences, and 'leading to an ultimately leaner ontology' (41).
Chapter two. The actualisation of the virtual in space
The process of discontinuous transition and condensation [already summarised above from page 56] leads to a discussion of examples that are less metaphorical than mathematics. The key here is the discussion of intensive and extensive properties—the first are ‘like temperature or pressure…continuous and relatively indivisible’, and can produce change through phase transitions, while extensive properties are like lengths or volumes, ‘divisible in a simple way’ (56). Again, the idea is to replace essences and typologies. It will be necessary also to discuss qualities, as distinct features of empirical objects, again requiring a departure from mathematical analogy.
Species and individuals. Species used to be thought of as categories expressing an essence or natural state, eternal archetypes. Darwin replaced this idea by showing that species have histories. More recently, species have been seen as individuals, not kinds, not having a higher ontological status than actual individuals. Instead of members of species exemplifying species-level qualities, the relation is more one of wholes and parts. Interactions among individuals produce the characteristics of the whole, as when new breeding patterns among individuals divide a species into two sub-populations. Species do differ from individuals in terms of scale, both geographically and temporally. This is an example of a ‘flat ontology, one made exclusively of unique, singular individuals, differing in spatio-temporal scale but not in ontological status’ (58).
It becomes important to specify the processes through which the whole emerges, which Deleuze argues is intensive. This is because there are two basic ideas of ‘population and heterogeneity’, and this implies that population characteristics are only statistical averages, quite unlike the essentialist approach. In essentialism, individual variation is unimportant and accidental. For ‘population thinkers’, by contrast, individual variation is ‘the fuel of evolution’, with homogeneity an unusual event (59). There is no need for archetypes or ideal forms exemplified in individuals. Instead, another key concept of Darwinism emerges—‘the norm of reaction’ (59). This presupposes flexibility between genes and bodily traits producing different sub-populations—different rates of sunlight or nutrient, for example, will produce subpopulations of different sizes. There is no need to insist on one underlying ideal phenotype. Instead of rating individual specimens according to some degree of perfection, we can operate simply with an idea of ‘relations between rates of change’ (60). For Deleuze, Darwin broke with ideas of essences in this double way, substituting populations for types, and ‘rates of differential relations for degrees’ (Thousand Plateaus, 60).
There are also structures between species and organisms—‘demes: concrete reproductive communities’ (60). They also feature intensive properties expressed as rates, such as the rate of growth of the deme, which in turn depends on things like the rate of availability of environmental resources. Demes can also have stable states as attractors, and transitions like bifurcations. They can also acquire cyclic properties, introduced by adding factors to growth rates.
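[A hedged sketch of the population-dynamics idea here (my toy numbers): in the standard logistic growth model, the carrying capacity acts as a point attractor for the deme's size, and making the growth rate depend on resources or time-lagged feedback is what would introduce cycles.]

```python
# Toy sketch (mine): logistic growth of a deme. The carrying capacity K is
# a stable state (point attractor) for population size; in a fuller model
# the growth rate r would itself depend on resource availability.
def simulate_deme(pop=10.0, r=0.3, K=1000.0, dt=0.1, steps=2000):
    history = []
    for _ in range(steps):
        pop += r * pop * (1 - pop / K) * dt  # growth slows near capacity
        history.append(pop)
    return history

history = simulate_deme()
print(history[0], history[-1])  # the population climbs towards K = 1000
```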
Like multiplicities, with their abstract qualities of differential relations and singularities, physical examples have ‘counterparts’ [but not resemblances, DeLanda insists] in these rates of birth, death, migration and resource availability. The correspondence arises because a ‘given intensive process of individuation embodies a multiplicity’ [in the sense of taking a bodily form?]. The lack of resemblance is explained by ‘the fact that several different processes may embody the same multiplicity’. In this way, multiplicities replace essences, and the intensive individuations that embody them replace general classes (61).
We therefore have ‘three ontological dimensions which constitute the Deleuzian world: the virtual, the intensive and the actual’ (61). Concrete individuals in actual worlds are the equivalent of the metric structures which condense out of the virtual. They can exist at different spatial scales, providing the familiar objects in the actual world. However, actual empirical objects possess qualities as well—such as individual organisms ‘playing a particular role in a food chain or having a particular reproductive strategy’ (62). Thus the intensive has to describe both extensive properties and qualities.
This can be seen in embryological processes, which produce not only definite spatial structures, but qualitative differentiation of cells into specialist cell types like muscle or blood. First, eggs have to produce spaces, initially as non-metric neighbourhoods, defined by ‘chemical gradients and polarities’ and with fuzzy boundaries. In these neighbourhoods, cells begin to cluster together, but the numbers and location of each cell are immaterial. It is local interactions between cells which matter. Aggregates of cells produce either sheets or migratory groups as stable states. These two states are connected through a phase transition. The outcome is either migration or folding of cells. The latter produces three-dimensional structures. The processes are subject to changes in rates, such as the birth and death rates of cells. There is no detailed genetic control, but ‘rather nonlinear feedback relations between birth and death rates and the processes of migration and folding’ (63). This means that no quantitative precision is available to biologists [in their explanations], but this is actually a strength indicating ‘the presence of a more sophisticated topological style of thought’ (63). This renders an understanding of the processes ‘anexact yet rigorous’ [a phrase in Deleuze] (64).
The outcome of migration and folding is the production of definite spatial structures, such as bones. The materials produced also have qualities, however, such as the ability to bear particular kinds of loads. These also arise from intensive processes, this time involving the production of specialist kinds of cells from an original set of ‘pluripotent’ cells. As they circulate, these cells exchange chemical signals which affect their differentiation, in a process called induction, controlled by regulatory genes, which seem to work in patterns which act as attractors, each of which represents a particular type of cell. What actually happens to a [germ cell] depends on which attractors exist nearby, a kind of local trigger, and the degree of ‘stimulus independence’ [and there is a link back to ‘mechanism independence’] acting at the virtual level. This is an important ‘part of what defines the traces which the virtual leaves in the intensive’ (65). There must however be lots of possible stable states to act as attractors, and this will depend on the connectivity of the network of genes. At particular ‘critical values of connectivity a phase transition occurs leading to the crystallization of large circuits of genes, each displaying multiple attractors’ (65) [this account is based on some recent biological work cited in the references].
Mathematical and biological models do not literally correspond. In some of the biological models, physical processes replace some of the stages in symmetry-breaking cascades. Eventually, it would be ideal to replace all the mathematical stages with such physical processes. In another example, it might be possible to talk about assembly processes of organisms. Manufacturing assembly lines are classically metric and rigid, but biological assemblies are connected by their topology—for example, the specific length of a muscle is less important than its attachment points, so the length can grow according to the length of the bones. Components are transported by diffusion through a fluid. They randomly collide, and locate each other through 'a lock and key mechanism' rather than exact positioning (66). This form of assembly also permits random mutations to arise without harming the organism [because there are many normal combinations also available], increasing ‘evolutionary experimentation’ (67). The resulting complexity combines extensive, metric structures, but also qualities, which are indivisible as well.
Using the word ‘intensive’ in this way involves extending its application away from thermodynamics, and there is a need to incorporate additional Deleuzian terms to explain how the intensive gets hidden underneath the extensive and qualitative properties of an actual product, producing 'the objective illusion fostered by this concealment' (69).
The issue is not just one of divisibility, since this would make the intensive and the qualitative look similar. There is also the issue of 'subjectively experienced intensities, such as pleasure' and their difference from 'objective intensive properties' (69). One way to distinguish the intensive and extensive, apart from divisibility, is to remember that intensive properties average out rather than add up, as when volumes of water with different temperatures are mixed. However, average values like this produce certain dynamic aspects as well, driving some process of equilibration, or 'fluxes'. This is an example of a positive or productive difference, unlike those of extensive properties, which produce relations of similarity and difference. Thus, as Deleuze says, 'difference is not diversity…Difference is that by which the given is given [or produced]’ (70).
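[A tiny worked example of the averaging point (my numbers): mix two bodies of water and the volumes add, while the temperatures average, weighted by volume; the initial temperature difference is the intensive difference that drove the flux towards equilibrium.]

```python
# Worked sketch (my numbers): extensive properties add when two bodies of
# water are combined; intensive ones average out, and the difference
# between them is what drives a flux towards equilibrium.
v1, t1 = 2.0, 90.0  # litres, degrees C
v2, t2 = 1.0, 30.0

volume = v1 + v2                             # extensive: 3.0 litres, it adds
temperature = (v1 * t1 + v2 * t2) / volume   # intensive: 70.0 C, it averages

print(volume, temperature)
# The initial 60-degree difference drove the equilibration; once mixed,
# the difference is cancelled and no further flux occurs.
```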
Intensities can produce change if differences are large enough, producing a steep gradient, through a phase transition. It is this positive result that matters, rather than the formal property of not being divisible in metric terms. In biological terms, intensive flows take the form of migratory movements or movements of energy through a population. Genetic differences are also equivalent [‘an extension of the original notion of intensive gradients, but…nevertheless related’ (71)]. However, in biological populations there are many more opportunities to interact than in thermodynamic ones. In particular, biological organisms have capacities which have no equivalent.
Capacities refer to the potential ‘to affect and be affected by other individuals’ (71). In chemistry, carbon has a much greater capacity to combine with other elements than inert gases. Biological components are assembled very flexibly, and have even more ‘combinatorial spaces’ (71). This notion of greater possibilities alludes to the virtual. Deleuze actually refers to the virtual and the intensive as possessing singularities and ‘affects (unactualised capacities to affect and be affected)’ (72).
Singularities have been well studied, but affects less so [so DeLanda pursues some parallel work]. One approach involves the idea of capacities to form novel assemblages, and it is possible that there may well be universal recurrent patterns [the example refers to ‘random grammars’ and ‘algorithmic chemistry’ (72)]. Certainly, adding the idea of capacity takes us away from classic thermodynamic definitions of the intensive. Forming assemblages is one way to think of capacity [and the example is the walking dog, forming an assemblage with the ground and a gravitational field]. Capacities cannot be reduced to the properties of the interacting individuals, and seem to emerge unpredictably. A similar notion is ‘affordance’, which also stresses the relational nature of capacities, being released only when dogs relate to ground. Spatial scale affects the interaction. Affordances are also symmetric [reciprocal]—dogs may hide in holes in the ground, but also dig holes of their own.
The ability to ‘articulate heterogeneous elements’ can also be seen as a part of intensivity (73). The notion ‘extensive’ can be extended to include the articulation of homogeneous components. This helps us explore the crucial notion of difference. Intensive processes preserve positive differences, ones which generate further differences, as in evolution.
Objective illusion. It is common to find the intensive concealed ‘under’ the extensive, along with the ‘concrete universals (singularities and affects)’ which drive the intensive. This is easier to see in assemblages where intensive differences remain, where homogenisation does not take place. Scientists often themselves systematically homogenise processes, however, or study systems in equilibrium, where differences cancel themselves out. This explains the persistence of the objective illusion. The problem is exacerbated if physicists only study final states [and linear systems]. There is both an objective and a subjective impulse towards this objective illusion [this looks like the classic Marxist notion of ideology, where both the subjective interests of economists and the misleading surface appearance of economic activity contribute to misunderstanding] (74).
It is important to study systems which preserve intensive differences. An example here is work focusing on ‘the field of far-from-equilibrium thermodynamics’ (75). Such systems maintain flows of matter and energy. It is also useful to study non-linear systems with multiple attractors—since these are easy to push away from equilibrium [and the work of Prigogine and Nicolis is cited, 75]. Such systems continue to display virtuality, in the form of potential alternative states, which can be produced by shocking the system. [Here, and later, a distinction seems to be appearing between complexity and mere diversity. The former relates to the virtual and its potentials, while the latter seems to relate to actualizations and their combinations? If this is so, the politics of complexity might need to clarify whether it is talking about complexity as such or mere diversity—diversity would certainly be easier to manage and change? In any event, it is not enough just to cite an analogy with a complex system, as Osberg does with Prigogine—we need a much more detailed analysis, as DeLanda argues at the end of this chapter].
However, the virtual also appears best in systems with high intensities [low intensities are associated with equilibrium]. Again, physicists often prefer to study systems at low intensity values, and thus help to promote the objective illusion again.
Deleuze says we must penetrate beneath the objective illusion using a philosophical method, focusing on constituting processes responsible for consistency, ‘to go back up the path that science descends’ [What is Philosophy, 76]. We need to trace back qualities and extensivities to the intensive processes that produce them, and then back to the virtual. DeLanda’s example turns on biological classifications again. We should study objects like the tetrapod limb not by classifying them according to the common properties of limbs, but by examining the process whereby they get produced, a matter of ‘asymmetric branching and segmenting’ (77). In other words, a virtual limb is unfolded through particular intensive sequences, including bifurcations and blocked bifurcations [?]. This will take us back to the intensive, but we need also to get to the virtual: the mathematical analogy of the relation between topological and other spaces helps here.
Extensive structures can be seen as occupying the bottom (metric) level, intensive processes the intermediate [e.g. affine] level, and the virtual the topological level. No hierarchy is being claimed here—‘a nested set of spaces’ is a better description (78). Each space in this case needs to be defined by its affects, whether it affects or is affected by specific operations such as rotating, folding and so on.
The virtual as a continuum. Discontinuous individuals in the actual world are differentiations of this continuum. This is no simple topological space, however, but a ‘heterogeneous space made out of a population of multiplicities, each of which is a topological space on its own…a space of spaces, with each of its component spaces having the capacity of progressive differentiation’ (78). What then would mesh these different spaces together into a ‘plane of consistency’? [Nothing to do with logical consistency, DeLanda says, but rather defined as ‘the synthesis of heterogeneities as such’ (78), apparently explained in What is Philosophy. It starts to look a bit circular here, since the need to be consistent is rooted in the very definition of the plane, but the explanation of consistency assumes that there is such a plane that can synthesise?].
We can progress by considering the characteristics of the objects that populate the virtual, especially multiplicities. The mathematical definition takes us only so far, and we have to abstract from the mathematics to get philosophical concepts [that are content-free, or pre-individual]. If each singularity is extended into an infinite series, multiplicities can be meshed, but we have to spell this out in a bit more detail.
The need to extend singularities arises from considering physical systems with multiple attractors, or biological assembly processes with many combinations. Both of these cases allude to the virtual, since neither can be precisely described numerically, and novel assemblages are always possible. How can we explain these unactualised capacities?
Abstracting from mathematical notions such as a function is one step. Functions normally model systems in terms of input and output variables, the latter indicating a particular state in a state space described by the former. However, we do not wish to ‘presuppose individuality’ (80). [We want to maintain the idea of the virtual]. We therefore need some idea of a formless function, without values as such, but referring only to rates of change [this reminds me of the importance of the calculus for Deleuze, where the relation is preserved even when the actual values are zero in both cases]. In this way, virtual relations can be thought of which only determine each other, with no outside determination, and no actual content. ‘Virtual singularities should be distinguished from individuated states’ (80). Singularities like this have pre-individual characteristics [it is a mistake to see attractors as special points of state space].
We have to think of singularities as defined by vectors. Vectors are not individuated states, but ‘instantaneous values for rates of change’ (80). Occasionally, vectors are stationary, as a ‘topological accident’, and these have the status of events. But not even these are actual events for Deleuze, but ideal events—‘turning points and points of inflection…pre-individual, non-personal, and a-conceptual. Singularity is neutral’ [that is, lacking content?] (81).
[The quote above comes from Logic of Sense, and DeLanda has an interesting note on it. An actual event, compared to an ideal one, appears ‘as a more fleeting and changing individuation’. Deleuze argues that events have the individuality of a haecceity…the unique singularity of a moment. There is a quote from Thousand Plateaus: ‘individuality [can be] different from that of a thing or subject…[and]…consist entirely of relations of movement and rest between molecules or particles, capacities to affect and be affected’. DeLanda goes on to say that the walking-dog assemblage can be one of these, a concrete event, involving specific animals and specific conditions, rendered by Deleuze and Guattari as “This should be read without a pause: the animal-stalks-at-five-o’clock”. This event can be described in terms of ‘relations of rapidity and slowness: the ground affords the animal a solid surface only because relative to the speed or temporal scale of change of the animal, the ground changes too slowly’. By comparison, at the virtual level, singularities are also haecceities, annoyingly enough, but speeds and affects are different—producing an accidental moment in a field of velocity vectors which provides some sort of insulation from the transformations going on in the rest of the field. So is a haecceity in the first sense a moment of fixity that arises from some sort of empirical combination, whereas at the virtual level it is a unique interaction of vector fields?].
So, getting back to the notion of extension, a series of ideal events stretches out from each multiplicity. To take another analogy, there are phase transitions in water at temperatures of 0° and 100°, but a series of ordinary events in between them, which only produce limited linear effects. At the virtual level, singularities in multiplicities produce a series of ‘ordinary [but still] ideal events extending up to the vicinity of other singularities belonging to other multiplicities’ (81).
Naturally, these events must not be separated by some sort of metric scale. To detour into mathematics again, there are clearly cardinal series and ordinal series of numbers. Ordinal scales are not metric but relational. However, ordinal scales are related to numerical ones [as topology is related to geometry?]. Another quality of ordinal scales is that they cannot be added together to cancel their differences. Deleuze thinks that this is an ontological matter, where the ordinal actually constitutes the numeric ‘through a symmetry-breaking discontinuity’ (82). Thus we can say that the ordinary events between singularities are only minimally actualised, separated from each other by ordinal scales. The strings of events can be woven into a continuum, again avoiding individuation and anything concrete, producing communication between them, and also a proliferation. As usual, we wish to avoid all those relations like similarity and analogy.
Multiplicities do not actively interact with each other, because they are conceived of as essentially impassive, neutral or sterile. They are independent of any particular mechanisms, and may be affected by several causal mechanisms. There has to be some causal mechanism, or they would float off into the transcendent, and Deleuze wants to insist they are immanent. As usual, we have to think of an unusual possibility: that they are incorporeal themselves, but affected by corporeal causes, the ‘historical result of actual causes possessing no causal powers of their own’ (83). They are in quasi-causal relationships. [This gets very close to the delirious and obsessive detail of Anti-Oedipus, where one thought has to be immediately justified or qualified by further and further refinements and details in order to preserve the implication of the first term and fight off rivals. Deleuze seems to have his own rigorous philosophical rules to guide this obsessional pursuit of implications]. We have to work with the dubious notion of quasi-cause to account for the invariant properties of multiplicities, which must be produced ‘by at least one operator’ (84).
Naturally, this quasi-causal operator cannot be seen as working in the usual way, which would involve too much individuation. It can only have ‘the most ethereal or least corporeal of relations’, defined as ‘”resonances or echoes”’ (84). We can begin to understand this further by looking at abstract communication theory [which also avoids content or individuation by referring only to a possible link between two series of events with different probabilities—if a change in one series affects the probability distribution of the other, information is said to have been transferred]. Even this is not abstract enough, however, and Deleuze specifies that the connection cannot be numerical; it must be a matter of ordinal distance, and communication should take place only via ‘the difference between the singular and the ordinary, the rare and the common, without further specification’ (85). This will be discussed in subsequent chapters.
Has Deleuze gone too far? Other phenomena can be studied empirically. Even symmetry-breaking cascades can be checked against empirical findings. But purely intensive processes and their effects cannot be. Nevertheless, we are moving away from the usual argument that eternal essences can be grasped a priori. Deleuze’s scheme features ‘concrete empirico-ideal notions, not abstract categories’ (86). There are hints at least of a quasi-causal operator in a new field studying ‘emergent computation’ [discussed further on page 86]. The simplest example [!] refers to what happens in materials near phase transitions—apparently, separated events in the system can communicate, by fluctuating around a given state. Near phase transitions these fluctuations begin to correlate, and thus transmit information in the very abstract sense. Again, this seems to occur in a variety of materials, displaying the desirable ‘divergent universality’ [or mechanism independence] (87). It may be that living organisms have acquired this capacity, and as a result, evolution has kept them at the edge of phase transitions rather than at equilibrium. There is a notion of populations of cells as ‘poised systems’ (87) [compare with the notion of a trembling organisation?]. All this is still controversial and contested, but there is enough, DeLanda thinks, to justify ‘postulating such an entity as a quasi causal operator’ (88).
Although Deleuze’s ontology is unfamiliar, it is at least very detailed—and speculative. He is prepared to offer an alternative explanation to the notion of essences, and one that does not just refer to social conventions. He has given a detailed description of the process of individuation, and offered this in the form of a discussion of mechanisms of immanence, not transcendental concepts. Overall, he has attempted ‘to explain how the virtual is produced out of the actual’ (88). [Actually, this is a good point. Essentialism does not really explain how essences are connected to embodiments. Come to that, social constructivism doesn’t really explain how construction occurs—they need at least some mechanism like habitus? How about critical realism or structuration approaches, which also operate with a virtual level—their mechanism is only human agency?]
Chapter three. The actualisation of
the virtual in time
There seem to be two notions of time in physics, turning on whether or not there is a fundamental asymmetry between past and future—in thermodynamics there is, but in classical physics, including relativistic physics, there is an invariance. The issue turns on whether reversing time makes a difference to the values concerned. In thermodynamics, running the system one way leads to diffusion and then equilibrium, but running it the other way leads to an increasing amplification. Behind these differences are differences in the ways in which the laws governing the processes vary.
The significance of laws appears in the next chapter, but there has been a tendency to keep the laws intact in a way which allows for symmetry, but which requires irreversibility to be explained [an interesting note gives some examples of how this works, including seeing the directionality of time as a subjective effect, or a contingent effect, page 137]. However, nonlinear processes and far-from-equilibrium states have brought irreversibility back to prominence [and Prigogine has been important here, citing Bergson, and wanting to maintain the positive power of time in becoming, 105].
Deleuze also wants to insist on becoming without being, where individuals arise from an irreversible process of individuation. Bergson has influenced both. Generally though, Deleuze wants to extend the discussion in the same terms as used when discussing space: how a ‘temporal continuum…through a symmetry breaking process yields the familiar, divisible and measurable time of everyday experience’ (106). We need a discussion of extensive and intensive time, the former divisible into instants, especially sequences of cycles of different kinds. The sequences are traceable to underlying processes. These are complex, with oscillations of various lengths nested within each other. Intensive characteristics will explain how metric temporality emerges—this will draw on work that shows how external shocks can halt or induce various oscillations. Generally, sequences of oscillations will show both singular and ordinary events, which reveals intensive factors at work.
Extensive time. There is a nested set of cycles in a flat ontology. The spatial scales involved have already been discussed, but there are temporal scales too, arranged as before in a variety of structures ranging from individuals to the virtual—‘an individual typically displaying a spectrum of timescales’ (107) (daily, monthly and yearly cycles, or reproductive cycles and so on). These cycles can overlap, but it is conventional to assign ‘a particularly prominent timescale’ to each individual level (such as our lifetime for an individual member of a species).
Can these cycles be seen as a result of symmetry-breaking processes? There happens to be a particularly suitable bifurcation (the Hopf bifurcation) explaining how steady-state attractors become periodic ones. The example to illustrate this turns on a spatial analogy involving the transition from a gas to a crystalline state [it is complex, page 108. In spatial terms, there is a loss of invariance following this process. The same goes for time distributions, apparently, and a displacement can produce ‘a sequence of cycles that is out of phase with the original one’. Again Prigogine and Nicolis are cited in support. They seem to argue that in steady states we can ignore time, but in periodic motions time becomes important, and this can be referred to as a breaking of temporal symmetry].
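[A hedged sketch of the Hopf bifurcation in its textbook normal form (my choice of illustration; the book's example is the gas-to-crystal analogy): below a critical parameter value the steady state attracts, and past it trajectories settle instead onto a periodic cycle.]

```python
# Sketch (mine): the radial part of the Hopf normal form, dr/dt = r*(mu - r^2).
# For mu < 0 the steady state r = 0 attracts; past the bifurcation at mu = 0
# trajectories settle on a cycle of radius sqrt(mu), a periodic attractor.
def settled_radius(mu, r=0.1, dt=0.01, steps=100000):
    for _ in range(steps):
        r += r * (mu - r * r) * dt
    return r

for mu in (-0.5, 0.25, 1.0):
    print(mu, round(settled_radius(mu), 3))
# -0.5 -> 0.0 (point attractor); 0.25 -> 0.5 and 1.0 -> 1.0 (limit cycles
# of radius sqrt(mu)): a steady state has become an oscillation.
```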
Linear oscillators can be explained by looking at the details of their initial conditions, and they are typically regular. However, nonlinear oscillators do not depend on extrinsic constraints, and are better seen as pulses of action, each emerging from its past [difficult stuff again, page 109]. Their cycles can be seen as occupying a nested set as before, implying that time unfolds ‘pulse by pulse’ rather than offering some universal scale. The normal metric notion of time would be one possibility in this unfolding, arising from a particular oscillation or a nest of them. Deleuze talks about particular syntheses of time as lived presents, where the past and the future are contracted—he calls this ‘Chronos’, and notes that only the present is important, with the past and the future related to it. There is apparently a vast extended present involved [the reference is to Logic of Sense]. DeLanda explains this by saying that because of the different scales involved, past and future in a quick oscillation are still only the present in the longer one—geological time, for example. Even biological organisms ‘have many past and future events for oscillators operating at atomic and subatomic scales’ (110). It follows that normal metric extensive time also contains shorter-cycle oscillations, and thus it is ‘”composed only of interlocking presents”’, quoting Deleuze as above, page 110. Although Deleuze sometimes refers to ‘lived presents’, he intends no psychological element—‘this is simply a matter of convenience of presentation and not fundamental to his account’ (110).
A classic example from relativity theory reveals the objective dimension. It is not just that the space-travelling twin looks less old to the earth-bound observer—it is the atomic oscillators in the cells of each person which are ‘objectively affected’ in the travelling twin (111).
‘Lived present’ can be better understood by a relation between timescales and capacities. We know that spatial scales affect capacities—small insects may walk on water, but not larger mammals. Timescales are similar, affecting the perception of change—so extremely slow cycles appear to be not moving at all [so everything really is becoming, including things like mountains or planets that appear to be fixed to us]. Extremely fast oscillations can mean that phenomena are irrelevant. Subjective experience simply interprets these objective relations.
There is also relaxation time—the time taken to settle into a stable periodic state, to be recaptured by an attractor. This will vary between phenomena, ‘and in each case they display characteristic timescales’ (111). Relaxation time can also affect affordances, seen best in those curious materials called glasses, which are really liquids flowing extremely slowly. Observers will be able to detect a flow in glass given ‘sufficiently long observational times’. Again, no psychological or subjective qualities are implied, simply a relation between observation and relaxation times, an interaction. In this sense, we can replace the observer with some other material and refer to ‘how the glass “appears” to it’ (112). The materials can interact as with affordances—liquids can flow around glass and also erode glass. Thus these capacities are affected by relative timescales, especially relaxation times. This helps understand ‘lived presents’, how individuals perceive their own timescales relative to the capacities of others. Even inorganic things can have a lived present, however. The present therefore is a product of oscillations, movements and the type of matter involved.
Intensive aspects of temporality.Intensive
properties are involved in the production of
individual oscillations.Some
work in biology is cited, page 113, on how the
internal clocks of various organisms can be
shocked, producing a halt in the oscillation.The
shock itself triggers the death of an oscillation,
depending on its internal and intensive structure.Some
shocks will completely annihilate them selection
if there is already a stable steady state
attractor [which kind of replaces the
oscillation].The shock can produce ambiguous behaviour,
arrhythmic patterns if the attractor is close to a
phase singularity.It is also possible to shock systems into
creating oscillations around a phase singularity.The
results seem to indicate some intensive,
mechanism-independent tendency, applying to biological
organisms and even inorganic chemical reactions
(113).
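[A minimal sketch of my own, using the radial part
of the standard Hopf normal form rather than
anything in Delanda: when the parameter mu is
positive, an oscillation of amplitude sqrt(mu) is
the attractor and a shocked oscillation recovers;
when mu is negative, the steady state is the
attractor and the oscillation dies, as in the
shocked internal clocks above.]

    def amplitude(mu, r0, dt=0.01, steps=5000):
        # Euler-integrate dr/dt = r * (mu - r**2), the amplitude
        # equation of a Hopf oscillator.
        r = r0
        for _ in range(steps):
            r += dt * r * (mu - r ** 2)
        return r

    print(amplitude(mu=1.0, r0=0.1))   # -> ~1.0: the cycle recovers
    print(amplitude(mu=-1.0, r0=1.0))  # -> ~0.0: steady state wins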
Oscillators can also
synchronise or entrain temporal behaviour (114).
This is similar to the capacities to form
assemblages between heterogeneous individuals.Such
an assemblage presupposes certain qualities of an
individual, but also an adequate ‘outside’ [or
environment?].Entraining appears when for example
organisms synchronise their sleep cycles with
cycles outside themselves, such as the day cycle
of the planet.[However, apparently purely physical
oscillators can also entrain].Isolating
animals
from the external cycles can reveal an autonomous
internal cycle—a 25-hour one for human beings,
apparently (114).The planet’s rotational period synchronises
these internal cycles and can do so flexibly for
different organisms.Again this is typical of an intensive
process, both stimulus and mechanism independent.There
seem to be weak coupling signals involved, but
they must operate at a particular level of
intensity or strength.
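[A minimal sketch, my construction: the Adler
equation for the phase difference between a
free-running 25-hour clock and the planet's
24-hour cycle. The phase locks, i.e. entrainment
occurs, only if the coupling strength K exceeds
the frequency mismatch; the values of K are
illustrative.]

    import math

    d_omega = 2 * math.pi / 24 - 2 * math.pi / 25  # mismatch, rad/hour

    def phase_difference(K, hours=2000, dt=0.05):
        # Euler-integrate dphi/dt = d_omega - K * sin(phi).
        phi = 0.0
        for _ in range(int(hours / dt)):
            phi += dt * (d_omega - K * math.sin(phi))
        return phi

    print(phase_difference(K=0.05))   # settles near 0.21 rad: entrained
    print(phase_difference(K=0.001))  # keeps growing: too weak to entrain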
Temporality is therefore
sequential, a sequence of oscillations.But
within these are singular and ordinary moments [in
the biological work being cited, singular moments
are those where phase transitions can occur?Or are
there other ‘sensitive points’ as well, where
oscillations can be seriously affected?].There is
also the notion of parallel structures in time,
already implied by the notion of entrainment,
where several oscillations act in unison.The
spectacular example turns on the characteristics
of a slime mould, which can take the form of
individual amoebae, then aggregate into a single
field, and eventually into a single organism, the
process being controlled by critical levels of
the availability of nutrients (115).
So far, then, we have seen an
abstract symmetry breaking event, a Hopf
bifurcation, and some experimental results from
biology.This
is the same sort of process as in the previous
chapter, where abstract models have to be made
physically plausible, and, as a result, complexity
[in the ordinary sense] must be introduced.It is
also necessary to define a virtual continuum.Something
similar has now to be done to explain the birth of
metric time.
We know from the biological
work how important it is to develop critical
timing and parallelism, but biological evolution
might also benefit from the explanation of
novelty—parallel developments and complex
relations between them, including relations of timing,
can lead to different processes of acceleration in
parallel developments, and this can produce
novelty.Rates
of change and couplings are important in
embryology, and time is involved in many of these
notions of important rates.Different
rates of change can affect each other as in the
example with affordances and relaxation times.Even
processes operating at similar scales can affect
each other according to their rate of change [the
example given is processes that produce patterned
skin, which depends both on concentration of
chemical substances and how quickly they react as
they diffuse through an embryo, 117].There
are important rate-independent phenomena too, such
as the information contained in the genes, which
is not itself decoded at different rates, but
produces enzymes with different controlling rates
[this leads to discussion about whether or not
genetic action can be seen as some kind of
computer program—if so, it is a parallel
processing network—117-8].
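[To make the patterned-skin example above
concrete, here is a minimal sketch of the Turing
criterion, my construction rather than Delanda's,
with an illustrative made-up reaction scheme:
kinetics that are stable when well mixed become
pattern-forming purely because the two substances
diffuse at different rates.]

    import math

    # Jacobian of illustrative activator-inhibitor kinetics; stable
    # without diffusion since trace < 0 and determinant > 0.
    a, b, c, d = 1.0, -2.0, 3.0, -4.0

    def max_growth_rate(Du, Dv):
        # Largest linear growth rate over spatial wavenumbers q.
        best = -1e9
        for i in range(1, 500):
            q2 = (i * 0.01) ** 2
            tr = (a - Du * q2) + (d - Dv * q2)
            det = (a - Du * q2) * (d - Dv * q2) - b * c
            disc = tr * tr - 4 * det
            lam = (tr + math.sqrt(disc)) / 2 if disc >= 0 else tr / 2
            best = max(best, lam)
        return best

    print(max_growth_rate(1.0, 1.0))   # < 0: equal rates, no pattern
    print(max_growth_rate(0.01, 1.0))  # > 0: unequal rates, pattern grows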
This replaces the idea that a
novelty can only be added at the end of the
sequence.Instead,
‘new designs may arise from disengaging bundles,
or more precisely, from altering the duration of
one process relative to another, or the relative
timing of the start or end of the process’ (118).This is
apparently called heterochrony, and one biological
result is neoteny, where sexual maturation outpaces
the development of the rest of the body.
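[A minimal sketch of heterochrony, mine not
Delanda's, with made-up rates: the adult 'design'
depends only on the relative timing of two
parallel processes, somatic growth and sexual
maturation; no component is added or removed.]

    def adult_form(somatic_rate, maturation_rate):
        # Fraction of somatic development completed at the moment
        # sexual maturation (progress = 1.0) is reached.
        time_to_maturity = 1.0 / maturation_rate
        return min(1.0, somatic_rate * time_to_maturity)

    print(adult_form(0.10, 0.10))  # 1.0: ancestral form
    print(adult_form(0.10, 0.20))  # 0.5: neoteny, maturity in a
                                   # still-juvenile body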
The idea of parallel
disengagements further leads to the argument that
some evolutionary change involves simplicity not
additional complexity, which helps remove the
accusation of teleology from Darwinism—Deleuze
noted that progress can occur from simplification,
from the loss of components and so on (118).At the
end of embryological development, the relatively
fixed anatomy conceals these intensive processes,
although some remain, as in the ability to
self-repair.And even the most finished individual can
take part in other intensive processes, such as
those in ecosystems.
Ecosystems are assemblages of
heterogeneous species.The
population density of interacting species can
vary, and this is another intensive property
featuring phase transitions.An
environmental shock, for example, can produce a
relaxation time until equilibrium is regained—the
resilience of a population.This
sort of intensive property can individuate a
species.There
are also different timescales operating
simultaneously, such as the birth and death rates
of a population and the interaction with those of
others, as in a food chain.Amplification
effects can arise affecting relaxation time,
according to the degree of connectivity between
the species.Environmental changes can produce longer
oscillations.The whole can be seen as a network of
parallel processes.
However, an ecosystem can also
develop temporal cycles associated with evolution.Evolutionary
rates
are no longer thought of as uniform or linear, but
feature accelerations and decelerations.This can
also produce novel designs, as when a species
becomes extinct and new niches are available.Another
effect can be symbiosis, meaning not just a
beneficial relation between two partners, but ‘an
assemblage of heterogeneous species which persists
for long periods…and which typically leads to the emergence
of novel metabolic capabilities in at least one of
the partners’ (121).This can be a form of coevolution, where
the partners exert selection pressures on each
other.Symbiosis
occurs at different scales: the cellular level,
where cells have combined to generate
photosynthesis (121); or where microorganisms
cooperate with animals or plants in digestion or
nitrogen fixing.Both of these examples feature accelerated
processes which show ‘meshing of the capabilities
of two or more heterogeneous populations’ (121).
Deleuze normally refers to
singularities and affects, but sometimes he refers
to speeds
and affects, ‘speeds of becoming and
capacities to become’ (122).Parallel
processes can be defined in terms of relative
speeds and rates of acceleration or deceleration.This can
become an evolutionary strategy, allowing an
individual ‘an escape route from an
overspecialised design’ (122).[A kind
of biological version of becoming and escape].Coevolution
can be rendered as a composition of speeds and
affects.Symbiosis
similarly can enable ‘a fully formed being [to]
cease to be what it is to become something else,
in association with something heterogeneous on the
outside’ (122).
This helps flesh out the
abstract notions of timing and duration, as
promised.A
further complication arises when we consider that
time is always combined with space to produce
‘spatio-temporal phenomena’ (122).This
helps see that the emergence of metric properties
occurs simultaneously with space and time in a
single process, preserving the flat ontology.Virtual
space time involves the same elements in Deleuze—a
non metric continuum, changing populations of
virtual multiplicities and a quasi causal operator
which assembles a plane of consistency.This is
speculative, but it does preserve ‘an empiricism
of the virtual, even if it does not (and should
not) resemble the empirical study of the actual’
(123).Deleuze
has at least posed the problem of how to derive a
virtual ontology while avoiding essentialism and
typologies.He
has at least pointed to the need to provide a
mechanism of immanence.
The quasi causal operator is
such a mechanism.It provides ‘a minimum of actualisation’
for virtual multiplicities by prolonging them into
a series of ordinary [but still ideal] events, and
suggesting a series of convergence and divergence
between them, using abstract communication theory
as before (123).Such
communication can arise spontaneously in poised
systems, even inorganic ones.In
parallel processing networks, other critical
levels may produce communication ‘in the
neighbourhood of a critical point of conductivity’
(124).Both embryos and ecosystems may also need to
be poised to maximize communication.
However, ‘changing
distributions of the singular and ordinary’ can
also be produced by information transmission.Virtual
series can only be based on an ordinal scale, and
statistical distributions cannot be conceived as
fixed [if we are to preserve Deleuze’s interest in
the virtual as minimally actual or individuated].Instead
there must be ‘mobile and ever changing (“nomad”)
distributions in the virtual series, establishing
both convergent and divergent relations between
them’ (124), produced by the quasi causal
operator.
This operator condenses
singularities by producing communications between
the series emanating from every singularity,
linking them, differentiating the series,
‘ensuring they are linked together only by their
differences’ (125) [the note on page 145 expands
this view and says that Deleuze also thinks in
terms of entrainments arising from initially weak
forces.The
homely example considers two pendulum clocks
initially transmitting weak signals to each other,
through vibrations in the floor.Once
resonance is achieved a much stronger connection
emerges.This
notion
of resonance is used to explain the action of a
quasi causal operator {and this also explains why
Deleuze is always banging on about vibrations,
multiplicities as vibrations, and how they connect
with each other avoiding resemblances and
analogies and all the other forbidden terms}].
This explains the spatial
characteristics of the virtual as a mesh, but
there is a temporal dimension too, which Deleuze
calls ‘Aion’ [the prat].This
refers to a [mysterious] time of decomposition
where spaces become sub spaces.Spaces
condense singularities, but time is needed to
complete this event [or something! 125].Deleuze
uses the term ‘adjunction’ here, and this is
borrowed from group theory—the ‘”adjunction of
fields” is an abstract operation very closely
related to the idea of the progressive
differentiation of space through a cascade of
symmetry breaking transitions’ (125).The
point seems to be to differentiate this process
from others such as bifurcation, which feature a
sequence of events and stable states, only one
alternative of which is actualised [the example is
the convection cycle again and how either
clockwise or anticlockwise rotation is produced,
but not both].There may well be unstable options
produced, but they don’t last long.In
virtual unfoldings, however events can coexist
rather than following each other, and ‘each broken
symmetry produces all the alternatives
simultaneously, regardless of whether they are
physically stable or not’ (126).
Thus virtual forms of time
involve absolute simultaneity.In
relativistic physics, two events cease to be
simultaneous if they become separated in space,
but Deleuze's conception goes further.In
virtual space there are no metric distances, so
that the idea of a stretch of time becomes
meaningless [you further need the argument that
‘ordinal distances…join rather than separate events’ (126)].
At the virtual level, not even
the laws of relativity apply.Instead,
the very ‘temporality of the laws themselves’ is
produced (126).Although physics does not normally worry
about the ontological status of its fundamental
laws, philosophers commonly understand them as
universal essences and thus as timeless.But
Deleuze’s notion of the virtual sets out to
replace this philosophical understanding.
What does a non-metric form of
time look like?It would be unlike any present time, even
the longest cycle.It cannot be entirely timeless, since that
would involve essentialism.One step
involves seeing the virtual as ‘populated
exclusively by pure becomings without being’,
which avoids timeless being (127). A pure
becoming, unlike an actual one, must lose
any trace of sequentiality or directionality.The
analogy is with phase transitions, especially
unactualized ones.At the critical event, such as 0° C, water
neither melts nor freezes, and both states are
actual becomings.By contrast, ‘a pure becoming…would
involve both directions at once, a melting –
freezing event which never actually occurs but is
“always forthcoming and already past”’ (quoting Logic of Sense
and other references, note 58, 148) (127).All
events in virtual space, including the unfolding
of multiplicities and their prolongation into
singularities are pure becomings.Thus,
time itself can be seen as unfolding, as a pure
order.It
has no present, since having a present would be to
stop becoming.It is instead ‘an ordinal continuum
unfolding into past and future, a time when
nothing ever occurs but where everything is
endlessly becoming’ into the past and into the
future (127).Pure becoming is symmetric, with the normal
direction of time only appearing when symmetry is
broken in the process of actualisation.
Just as multiplicities are
neutral or ‘causally sterile’, so is pure becoming
as a temporal dimension (127).However,
just as the quasi causal operator has capacities
to affect multiplicities, ‘acting in parallel with
physical causality’ (127), and producing a mesh of
interwoven multiplicities, so it has a temporal
aspect as well.This goes on at a virtual level, so no
normal passage of time is involved—‘this other
time must indeed be conceived as instantaneous’
(128).[This
seems to be a parallel here with the notion of
pure becoming, pure instant].Normal
time always features a limited duration [a
cycle], while the series of cycles can go on to
infinity.However
virtual time is unlimited in its duration, but
finite [instantaneous].What the
quasi causal operator does is to bring about an
event of zero duration [with an incomprehensible
quote from Deleuze Logic of Sense, 128, possibly
meaning that there must be some form of minimal
actualisation again?The quote mentions Aion].
Delanda admits that this
conception of time needs work, unlike the parallel
discussion of space.Even the discussion of space seems
unnecessarily speculative—why not just use
nonlinear mathematics with its notion of
attractors and bifurcations?However
to do so would invoke essentialism again, this
time in the notion that platonic ideas were being
described by mathematics [a problem for Osberg who
just uses Prigogine’s biology as a model for
complex organisations].A
mathematician is cited in support of this view
that nonlinear mathematics should not be seen as a
self organizing system theory for everything
(129).Deleuze
would be against any attempt to develop some
eternal taxonomy, which leaves him with no choice
but to push on to speculate about complex
mechanisms for meshing at the most abstract level.
It could be argued that it
would be simpler to go for nonlinear mathematics,
but this would be ‘an illegitimate use of
simplicity’ (130).Platonism might be more familiar, and there
are no other well known attempts to specify
mechanisms of immanence, but that is no reason not
to proceed.The
complexity is not over yet anyway.We have
seen how a continuum could be built from a
population of multiplicities, but where do those
multiplicities come from?They
must have been produced, otherwise they would look
like essences.We need yet another immanence mechanism to
explain them, another task for the quasi causal
operator, to extract multiplicities [from
singularities ?] (130).Deleuze
uses the term section or slice here.This is
a mathematical operation which reduces the
dimensions of the object [the bit about the circle
being a slice through a sphere, a sphere being a
slice through some four dimensional figure and so
on].Mathematics
already uses this term to analyse attractors, by
slicing a complex topological shape in order to
simplify for study.
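[A minimal sketch of this mathematical use of
slicing, my construction: take a Poincaré-style
section of the three-dimensional Lorenz attractor
by recording the (x, y) points where the
trajectory crosses the plane z = 27, reducing a
complex 3-D shape to a 2-D set that is easier to
study. The parameters are the standard Lorenz
ones; the slicing plane is an arbitrary choice.]

    def lorenz_section(steps=200000, dt=0.001):
        x, y, z = 1.0, 1.0, 20.0
        points = []
        for _ in range(steps):
            dx = 10.0 * (y - x)
            dy = x * (28.0 - z) - y
            dz = x * y - (8.0 / 3.0) * z
            z_old = z
            x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
            if z_old < 27.0 <= z:  # upward crossing of the slice
                points.append((round(x, 2), round(y, 2)))
        return points

    print(len(lorenz_section()), 'section points in the slice')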
Deleuze has different notions,
though.Taking
the example of flow patterns in liquids, empirical
studies would see the attractors as the effects of
actual causes such as temperature gradients.However,
it would be possible to sample or slice through a
system like this in order to get the entire set of
attractors defining each flow pattern, and all the
bifurcations which mediate the patterns.What
this does in effect is to strip away all the
empirical detail, leaving only the topological
invariants, ‘the distribution of its singularities
as well as the full dimensionality of its state
space’ (131)
This is based on the idea of N
dimensional manifolds discussed in chapter one.These
manifolds have different dimensions defining
relevant degrees of freedom or ways of change.Each
actual multiplicity sampled empirically offers a
specific value for the dimensions, since
empirically only a finite number of ways of
changing is possible.Each
multiplicity would have a different value for the
number of dimensions.When
multiplicities are meshed together on a plane, a
dimensionally diverse population emerges, ‘a space
of variable dimensionality’ (131) [the idea of a
plane is a bit misleading, since it implies only
two dimensions] [the Deleuze quote seems to go
back on this by talking about multiplicities being
flattened when joined in the plane of consistency.The
quote is from 1000
Plateaus, so it might mean anything --but
see below].To
add to the confusion, sometimes the quasi causal
operator does this slicing, and sometimes the
plane of consistency itself: ‘the difference
between the two formulations is, I believe,
unimportant’ (132).[I’m seriously confused here, I confess.Do
multiplicities have to be sliced before they can
be joined?].
For Delanda, the point is that
when multiplicities are joined they are still
heterogeneous.However, the quasi cause ‘would operate at
N-1 dimensions’ [which seems to imply the slicing
process?].This
at least avoids the idea of the transcendental
source of unity which would require N+1 dimensions
(132).[The
quote from Deleuze (What is Philosophy) puts this in
different terms, saying that a multiplicity can
never be overcoded with a supplementary dimension,
but that they fill all their dimensions already.Any
change seems to have to come from the outside,
involving connection with the other
multiplicities, and this is described as ‘the
abstract line, the line of flight’.It is
possible that this common filling of all available
dimensions, and lack of overcoding is what makes
multiplicities capable of being flattened on the
plane of consistency, regardless of the actual
number of dimensions they possess?].
To summarise [!] There are two
immanence mechanisms in the quasi causal operator.There is
pre-actualisation, where multiplicities are
assembled together, or rather their ordinal series
are, with relations of convergence and divergence.This
would produce minimal actuality and the first
broken symmetry that will lead to full actuality.Secondly,
there is a counter actualisation, following from
the actual extensive and qualitative back to the
virtual.This
unflattens multiplicities, allowing them to unfold
and differentiate again.This
involves sampling all actual events, and unfolding
them into the past and future, redistributing the
singularities of the past and future which account
for the different levels [very confusing], (133).
Preactualisation begins the
process of actualisation by giving multiplicities
limited autonomy from the intensive, and a basic
power to actualise.At this stage, singularities could exist as
a potential alternative state.It would
begin the process which leads ‘down the symmetry
breaking cascade’.In this operation, the quasi causal
operator becomes known as the ‘”dark precursor”’,
quoting Difference
and Repetition, 133.With
counter actualisation, the process works up the
cascade from the intensive towards the virtual.Sometimes
this is a spontaneous process revealing the
virtual underneath the extensive.It can
be seen as something that ‘accelerates an escape
from actuality which is already present in some
intensive processes, the quasi causal operator is
referred to as a “line of flight”’, quoting 1000 Plateaus.[Note 77
on page 151 says the lines of flight may be
relative or absolute.Relative
ones are found in actual assemblages, as in the
examples from embryology and ecosystems,
reflecting affects and relations of speed and
slowness, and allowing an escape from rigid
morphologies.This is only a relative escape though, and
absolute lines of flight require these relative
escapes to be boosted so that they leave the
intensive altogether and head for the plane of
consistency.Indeed, a further Deleuze quote says that
these lines of flight actually create the virtual
continuum].
Delanda ends by saying that
even though these proposals might be speculative,
at least Deleuze has asked the right questions and
discussed the constraints if we want to abandon
essentialism and discuss immanence mechanisms.There
are apparently several different accounts in
Deleuze, showing that he was not satisfied with
the solutions he gave.It does
help us understand what Deleuze thinks philosophy
should do—‘creating virtual events
(multiplicities) by extracting them from actual
processes and laying them out in a plane of
consistency’ (134). This is what the quasi causal
operator is supposed to do [so philosophers are
merely the conscious element of it?An
interesting note 78, 152, discusses what Deleuze
means by a concept—not a matter of understanding
but referring to virtual multiplicities, itself a
concrete universal].This methodology distinguishes philosophy
from science—science is interested in the
actualized, while philosophy wants to ‘”extract
consistent events from the states of affairs”’,
quoting What is
Philosophy, 134.Science
and philosophy can therefore be seen as two
separate operations or as a single one, but
Deleuze does see that there are objective
movements which philosophers must grasp.Indeed,
philosophers must ‘become “the quasi cause of what
is produced within us, the Operator”’, quoting Logic of Sense,
134.This
leads to a connection between ontology and
epistemology—understanding means ‘correctly
grasping the objective distribution of the
singular and the ordinary defining a well posed
problem’.Spelling
out consistent problems implies that they have an
objective existence, somehow behind the
[empirical?] solutions, ‘just like virtual
multiplicities do not disappear behind actualized
individuals’ (135).
Chapter four. Virtuality and the
laws of physics.
In a flat ontology, we can
avoid the idea of reified or abstract totalities
such as institutions, or even nation states: these
all become concrete individuals operating at
different scales, produced by concrete historical
processes.As
with organisms and species, human individuals’
relations with these are best seen in terms of
parts and wholes.[See his book on social assemblages].Any
homogeneity arises from concrete historical
practices too.
The term ‘science’ can be
misleading if it refers to a totality, especially
one defined by an essence.Instead,
there are individual scientific fields, emerging
from populations as above [shades of Bourdieu!].These
populations can include mathematical models and
techniques, laboratory phenomena, machines and
instruments, experimental skills, ‘theoretical
concepts and institutional practices’ (154).[Quite
like ANT here then?].These
fields are affected by historical processes, which
produce resemblance or separation, transfer of
techniques [‘translations’ for ANT].As a
result, ‘as a matter of empirical fact, science
displays a deep and characteristic disunity’
(154), although this is often concealed by
philosophical effort, which attempts to develop
essentialist and typological thinking.
In classical mechanics,
fundamental laws such as Newton’s are often
viewed as general truths from which specific
propositions follow simply as a matter of logical
deduction.This
typically ignores the productive effect of
processes themselves, especially that of causal
connections.In the strict sense, this argues that
causal processes literally produce effects as a
mechanism.However,
much philosophy assumes that there are merely
constant regularities instead of active causes.This
followed from a notion that causality is
inherently linear, simple and separated: this made
causality look simple and law-like.However
more complex forms of causality exist ‘nonlinear
and statistical causality, for instance’ and we
need some account of intensive production
processes to include them (155).
By framing causality as a
series of laws, linguistic statements have
dominated accounts of causality.The
specificity of mathematical models needs to be
restored, however, especially those which involve
attractors and state space as before.For
example, we can then see why some solutions to an
equation behave in a particular way—they ‘approach
an attractor’ (155).[More excellent examples of possibilities
missed by linguistic reduction follow].Minimising
causes, and rephrasing causes in linguistic terms
both lead to essentialist thinking.
According to Hacking, Newton’s
theory of gravitation did much to avoid the
specifics of causality by avoiding specifying the
actual mechanisms involved [and Hume is also
important].This
is encouraged by the general avoidance of
experimental procedures in the philosophy of
science—these are much more complex and productive
than is suggested by logical deductions from laws.[Not a
bad description of the relation between
educational theory and practice as well].
The ‘deductive – nomological
approach’ sees explanation simply as logical
arguments following from general laws, in the form
of propositions ‘declarative sentences… what two
sentences in different languages, expressing the
same state of affairs, have in common’ (156).Usually
propositions describe initial and other
conditions.If
the prediction generated matches behaviour, then
behaviour has been explained, not by tracing
specific causes, but by ‘a typological approach:
subsuming a particular case under a general
category’ (157). The whole process starts with
axioms—‘a few true statements of general
regularities’, from which we can deduce general
theorems, and then use observations in the
laboratory to check for truth or falsity.However,
any truth content has ‘already been contained in
the axioms’ (157). Axioms
are therefore like essences.
A new approach requires
mathematical models in explanation, including
statistical models of ‘raw data’.Practising
science involves drawing upon a population of
these models.Models are sometimes constructed by
combinations of fundamental laws and ‘various
force functions’.The link between these models and laws is
not a simple one of deduction, but a modelling
process, including ‘many judicious approximations
and idealisations, guided by prior achievement
serving as exemplars’ (158).In
ontological terms, the models emerge from
concrete historical processes, as an open set,
despite occasional closures [unlike the misleading
textbook view according to Prigogine, 158].Cartwright
begins her account by saying that axiomatic laws
are actually false—that is they achieve generality
by compromising accuracy [as when the laws of
physics assume frictionless wheels and so on, and
have to assume that all other things are equal].Additional
modifications have to be made to increase
accuracy, but that loses generality.Instead,
physics operates by deploying ‘causal models
tailored to specific situations’ (159).Truth
content does not lie entirely with fundamental
laws, since models are not simply deduced from
them.Indeed,
causal models themselves vary in terms of whether
they focus on specific situations or general laws,
with the former much more frequent.
There are also statistical
models of data, never raw data as in positivism.Statistical
models were used originally to calculate
measurement errors, for example.Laboratory
testing involves ‘a complex practice of data
gathering, involving not passive observations but
active causal interventions’ (160) [good examples
in Latour].General
laws attempt to unify these models, and again this
effort is the result of an historical process
featuring concrete individuals such as Euler or
Hamilton (160).In the process of unification, however, the
notion of singularities became important, rather
than abstract forces [discussed 160f.One idea
from Hamilton is the ‘minimum principle’ which
apparently unifies a number of accounts—light
travels along the path that minimises distance,
for example.This was originally connected with
theological notions that God works with an economy
of effort.] Eventually, a ‘calculus of variations’
mathematicised this principle, ‘the first
technology ever to deal directly with
singularities’ (161).
The calculus involves trying to
pin down the actual processes, among all the
possibilities, that have changed physical
systems—‘for example a set of possible paths which
a light ray might follow’.Possibilities
can then be sorted into ordinary and singular
cases, and ‘the results of experiments show that
the singular cases (a minimum or a maximum) are
the ones that are in fact actualised’ (161).The
singularities are not proper attractors, but act
in a similar way. Attractors define the long term
state of a system, an equilibrium.This
helped
Euler replace the awkward Aristotelian
combination of final and efficient causes with
just final causes, which led to the idea
of more unified conceptions. [But doesn't this
lose detail? Worth it to break with Aristotle?].
(For Deleuze, final causes have to avoid any
teleological connotations, and so can only be
quasi causes).
This abstract approach avoiding
details meant that classical mechanics could
become some unifying approach—it had discovered a
mechanism-independent process.Other
more detailed models were still required however
to replace the details.Nevertheless,
this combination could not be reduced to
linguistic general laws, and mathematical models
proliferated.Again, some related to the actual world,
but others to the virtual world ‘by virtue of
being well posed problems’ (162).For
Deleuze, a problem is well posed if it gets right
‘the distribution of the singular and the
ordinary, the important and the unimportant, the
relevant and the irrelevant’ (162).
This leads to a problematic
approach [one focused on problems] to replace
fundamental laws and axioms.Nevertheless
the search for a single law which everything
follows still persists.Again it
will be necessary to pursue non linguistic and
more specific explanations of things like the
distribution of the important and the unimportant.This can
begin by considering explanatory problems.Traditional
explanations downplay the productive mechanisms
involved, but remain at the level of explaining
regularities instead of looking at why specific
processes occur.Why questions will require specific models
that exceed the linguistic formulation.
One example arises from a
philosopher called Garfinkel [!] who points out
that requests for explanations can imply different
‘contrast spaces’ [implied alternatives, which may
often not be shared in a conversation.The
example is the question ‘why did you rob a bank?’.The
robber replied ‘because that’s where the money is’].Answers
implying different contrasts can be true, but
still irrelevant—relevance and validity implies a
specific contrast, and this is not apparent in
linguistic formulations.Instead,
different contrasts should be seen as
possibilities, or even state spaces (165).Using
the vocabulary of state spaces and mathematical
models offers a more precise account than the
usual linguistic formulations.Possibilities
will then depend on the distribution of
singularities and their basins of attraction, or
in Garfinkel’s terms ‘“basins of irrelevant
differences, separated by ridge lines of critical
points”’ (166).
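[A minimal sketch of 'basins of irrelevant
differences', my construction: for dx/dt = x - x**3
there are two attractors, at +1 and -1, separated
by a ridge at 0; initial conditions within one
basin differ irrelevantly because they all end at
the same attractor.]

    def final_state(x0, dt=0.01, steps=3000):
        # Euler-integrate dx/dt = x - x**3 from x0.
        x = x0
        for _ in range(steps):
            x += dt * (x - x ** 3)
        return round(x, 3)

    for x0 in (-2.0, -0.3, 0.3, 2.0):
        print(x0, '->', final_state(x0))
    # -2.0 and -0.3 land together at -1.0; 0.3 and 2.0 land at +1.0:
    # only which side of the ridge you start on makes a difference.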
In Deleuze’s terms, problems
may be rendered false [irrelevant?] because they
are either under- or overdetermined—vaguely
defined so that it is impossible to see if an
event confirms one alternative rather than the
other, or too sharply defined, so that only
specific events will help decide alternatives
[wouldn’t this be a fair test, the only kind to
help us falsify?].The examples turn on trying to explain
changes in populations of predator and prey.One
explanation might be overdetermined in the sense
that it would require us to account for each
individual rabbit being eaten by each individual
fox. [So
what would be an underdetermined one?One that
considered irrelevant relations between predator
and prey?] What we need instead is an explanation
operating at a suitable level of specificity,
which would produce a stable relation.This
might vary according to the scale at which we are
operating—population level, or individuals.The
trick will be to study the causal capacities at
each level.
For Deleuzians, we also need to
trace quasi-causes.In the example of a population these would
refer to the long term duration of the cycle, ‘a
mechanism-independent aspect which still demands
explanation’ (168).In the case of biology, this has led to the
search for attractors governing stable cycles.However,
it is not just biological mechanisms which can be
studied like this.Convection flows and turbulence also pose
the problem of a suitable level to study the
process – in this case, descriptions at the
molecular level are irrelevant, ‘many collision
histories [being] compatible with the same macro
level effect’ (168).Macro factors such as temperature gradients
are required, together with quasi-causal factors
such as bifurcations and attractors.
[There is a useful summary of
the argument so far 168-9].The only
thing added is that contrast spaces can have the
complex structure of a cascade of bifurcations.In this
way, ‘a problem may gradually specify itself as
the different contrast spaces it contains reveal
themselves, one bifurcation at a time’ (169).A
connection with ontology is starting to appear:
the relation between problems and solutions ‘is
the epistemological counterpart of the ontological
relation between the virtual and the actual.Explanatory
problems would be the counterpart of virtual
multiplicities’ (169).Actual
explanations would be individuated.For
Deleuze, this means that actual organs in an
organism can be seen as solutions to a problem
[and presumably this runs the other way round as
well, that puzzling out an explanation means
recovering the ontological issues involved?].In
Delanda’s example,
soap bubbles and salt crystals are solutions
achieved by the molecules involved to the problem
of attaining minimal points of energy: ‘It is as
if an ontological problem, whose conditions are
defined by a unique singularity, “explicated”
itself as it gave rise to a variety of geometric
solutions’ (169-70).
So problems posed by humans
(epistemology) are intimately related to ‘self
posed virtual problems’ (ontology), and this is
‘characteristic of Deleuze’ (170).The two
are ‘isomorphic’.Experimenters who individuate problems in
the laboratory are acting isomorphically with
intensive processes of individuation in reality.This is
counter to the usual view of realism, where a
description is produced intending to mirror
reality by developing a relation of similarity.Naturally,
resemblance or similarity cannot be accepted by
Deleuze [so he weasels with the idea of
isomorphism?].Philosophers by contrast ‘must become
isomorphic with the quasi causal operator,
extracting problems from law – expressing
propositions and meshing the problems together to
endow them that minimum of autonomy which ensures
their irreducibility to their solutions’ (170).These
isomorphic processes go on the experimental and
the theoretical level.
It is important to realise that
the material itself in the laboratory behaves, as
much as the mathematical models—for example matter
can self organise and self assemble, and this is
lost by a focus on linear causality.There
are still ‘nonlinear and problematic’ relations
between materials, experimental situations and
causal models even after simplifying causes to
linear ones (170). Laboratories
produce heterogeneous assemblages which are
‘isomorphic with real intensive individuation
processes’ (171).Theoretical problems also correspond to
Deleuze’s analysis of state space, involving
trajectories and singularities: ‘the
singularities defining a problem in physics are
isomorphic with those defining the conditions of a
virtual multiplicity’ (171).We only
see these links once we focus on problems and not
solutions as prior.
The unwarranted emphasis on
solutions is found in classical physics and its
residual view of matter as passive, a mere
receptacle of forms, and where all the activity is
produced by the experimenter.Delanda
applies this to social constructivists as well
(171).By
contrast, a flat ontology assumes that properties
emerge from causal interactions rather than being
simply a sum.Stable emergent properties and new
causalities are responsible for larger scale
individuals.
The usual approach isolates and
separates causes in order to study them, simply by
ignoring complications.‘As
Bunge notes, this procedure may be ontologically
objectionable but is in many cases
methodologically indispensable’ (172).[This is
scientism, where the pursuit of a true method will
deliver true results]
It is not the initial
simplification of causes into objectivity that is the
problem, but its subsequent reification into a
principle.
Classical notions of causality
also include assumptions of ‘uniqueness,
necessity, unidirectionality and proportionality’
(172).Emphasizing
these has produced an impoverished notion of
materiality, ‘clockwork world views’.
Uniqueness.In
practice, several different causes can produce the
same effect, and the same cause can produce
different effects—heat can be produced by several
causes, and hormones like auxin can produce both
growth and inhibition of growth in plants,
depending on where it is located.An
additive conception of cause would not be able to
detect multiple causes, and it would not be able
to account for different effects that do not add up.
Necessity.A better
conception is to talk about enhanced probability,
as when smoking enhances the probability of
developing cancer.Again effects are distributed
probabilistically, and do not just add up for the
whole population.
Unidirectionality and
proportionality.In the
classic conception, effects do not react back on
causes, but we know that every action involves a
reaction.If
this is a large enough reaction, proportionality
also fails [since it assumes that ‘small causes
always produce small effects’ (173)].In
reality, a variety of options are available,
including ones where effects amplify causes
(positive feedback).Given this variety of causes and effects
and interactions, simple addition becomes rather
unlikely.Criticism
of the notion of externality is also implied
[another Aristotelian concept related to
efficient cause, apparently].External
causes are supposed to affect passive targets,
producing all the effects, but this breaks down if
the target ceases to be passive and can react back
as above.
In a flat ontology, linear
causes are special cases, and most causal
relations are statistical and probabilistic.There
are no passive receptacles, since internal
structures play a part in determining the effect,
for example if they feature alternative
attractors.As
a result, the domesticating impact of linear
causality ceases to apply, and the idea of
problems emerges again: it is no longer enough to
specify a single external cause or additive
effects.Objects
can self organise and self assemble, producing
emergent and unpredictable effects from intensive
processes.[The
example from 1000 Plateaus is the one about
artisans needing to work with wooden material
rather than attempting to totally dominate it].
In a similar way, experimenting
in physics also has a productivity of its own.It
involves individuating stable laboratory
phenomena, often involving novel products.These
phenomena can relate to several theories, and
persist even if paradigms change, or remain in a
problematic state lacking a full explanation.Individual
entities also have to be produced in this way,
‘connecting operations to a materiality instead of
deducing the form of the entities in question from
a theoretical law’ (176).So
measuring things like the mass of electrons or
their charge is a matter of ‘intervening causally
in the world’ (176).Once individuated, physicists learned from
electrons by making them part of heterogeneous
assemblages and observing their affects.It was
not until this sort of practice developed that
electrons were seen to be real.
This heterogeneous assemblage
can include machines, models, phenomena like
electrons, and the experimentalists themselves.These
different components are meshed together in a
complex process, and models are refined and skills
developed.‘The
whole’ [scientific knowledge itself] gets
stabilised by this assemblage as well.
Again, this is an
epistemological counterpart of intensive processes
in ontology [experimenting is seen as an intensive
process, gradually defining the problem by
considering what makes a difference, what is
relevant and so on].This is an emergent process, to be compared
with the immediate intuition of an essence.The
process delivers extensive products too, like
individual bits of data [particular solutions].Thus,
for Deleuze, [Difference and Repetition]
‘”Learning is the appropriate name for the
subjective acts carried out when one is confronted
with the objectivity of a problem…whereas
knowledge designates only the generality of
concepts or the calm possession of a rule enabling
solutions”’ (177).
So problems can be subordinated
to solutions by simplifying complexity into a
homogeneous linear system, or studying low
intensities or equilibria.This
limits the capacity of the material to form new
assemblages, and is acceptable [as long as it is
not reified].Subordination also arises when processes
such as experimental processes are neglected in
favour of formalised statements.These
are abstracted from practice, and they only become
important by inserting them into some theoretical
framework.
So far we have been discussing
models that interact with the empirical.Those
which interact with the virtual tend to be much
simpler, as in the case of formulating general
laws.To
analyse these models, requires Deleuze on state
space.State
space analyses cannot be used for causal analysis
because they are too simple [‘typically valid only
for models with a few degrees of freedom’ (178)].They do
not refer specifically to causal processes either,
although in some cases, it looks as if the
successive states can be linked causally, with the
initial state as the first cause, and the final
state as the effect.But this is ‘a mathematical expression
[of the] positivist reduction of productive or genetic
aspects of causes to a process of uniform
succession’ (179).Instead, each state is produced from the
same determinants, rather than one causing the
next one.
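[A minimal sketch of my own of the point just
made: for dx/dt = a*x every state follows from the
same determinants, the law plus the initial
condition, so any state can be produced directly
without stepping through the 'earlier' ones as
causes.]

    import math

    a, x0 = -0.5, 4.0

    def state_at(t):
        # The same determinants (a, x0) generate every state.
        return x0 * math.exp(a * t)

    # The t = 3.0 state is obtained without first computing t = 2.0.
    print([round(state_at(t), 3) for t in (0.0, 1.0, 2.0, 3.0)])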
There is information about
quasi-causal relations, however.First of
all, you have to see that vectors generate the
series of states in a trajectory, by producing
singularities.[bits I do not understand here, apparently,
singularities define the independent conditions of
the problem, while the vector field produces
solution curves—179.The quote from Difference and Repetition is
hardly helpful, and involves some notion of
complete determination as opposed to the specific
determination of singularities].
For conventional analytic
philosophy, trajectories are used as
predictions—measured values are transformed into a
curve which is then projected.Laboratory
systems can be produced with similar initial
conditions, and thus similar projections.A
perfect match means that the model is true to the
system, but normally, we settle for approximate
truths.However,
the whole argument is based on the geometric
similarity between curves, but there may be deeper
topological invariants producing an isomorphic
relation between model and reality—the assumption
is that the model identifies the singularities of
motion correctly.The implication is that this happens when
the model and the system are ‘coactualizations of
the same virtual multiplicity’, at least over a
given range (181).So the geometric resemblance must be
explained by these common topological properties
instead of being left as sufficient—that would be
a proper explanation of similarity.
In analytic philosophy, laws
are related to trajectories as generals are to
particulars [that is a logical relation again?]
Particular trajectories reflect different initial
conditions.For
Deleuze, however, it is the distribution of
singularities that matters—the particular state of
trajectories is irrelevant because many different
ones can ‘end up in the same place, the
attractor’.The
distribution of singularities determines which
initial conditions are relevant in the first
place.The
generality of the law is really produced by ‘the
universality of virtual multiplicities of which
both model and real system are divergent
actualizations’ (181).
This inability to focus on
problems rather than solutions has a long history,
and is associated with linguistic formations.However,
there is a more modern and specific mathematical
process too, occurring whenever problems are
judged by their solvability alone.There
have been breakthroughs too, where solvability was
itself seen as a result of a well posed problem
[the examples are discussed 182f.A gloss
follows].
Algebraic equations have
particular and general solutions, the first
involving the substitution of an algebraic term
with a number, the second describing ‘the global
pattern for particular solutions’, usually as
another equation.[So x² + 3x − 4 = 0 can
be solved in the usual way to give x = 1, but
there is also a general form: for x² + Ax − B = 0,
x = √(A²/4 + B) − A/2, where A and B take the place
of 3 and 4 in the equation above.[I think I can see
this].For
mathematicians, such a general solution
indicates a well posed problem.However,
this worked up to the fourth power of x, but not
for higher powers.The solution only emerged by thinking of
equations as occupying particular groups,
following the development of group theory [more
detail, 183]. The point is not to take general
solvability as the criterion of a well posed
problem, but to invert the process, so that
general solvability is itself explained in more
universal terms from group theory.These
universal terms involve permutations of
different solutions, providing groups of
equations, and this helps understand what is not
known.For
Deleuze, it is important to include what is not
known in the very objectivity of the problem.This
particular approach also produced increasingly
specific sub groups of equations, so that
‘[the] problem itself becomes progressively better
specified’ (184).
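[A minimal Python check, mine, of the general form
quoted above, plus the group-theoretic point that
the coefficients are symmetric functions of the
roots, unchanged by permuting them.]

    import math

    def roots(A, B):
        # General solution of x**2 + A*x - B = 0; the formula above
        # gives one root, its mirror gives the other.
        s = math.sqrt(A * A / 4.0 + B)
        return s - A / 2.0, -s - A / 2.0

    r1, r2 = roots(3, 4)           # the particular case x**2 + 3x - 4 = 0
    print(r1, r2)                  # -> 1.0 -4.0
    # Permuting the roots leaves the coefficients invariant:
    print(-(r1 + r2), -(r1 * r2))  # -> 3.0 4.0, i.e. A and B again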
In another
example, space and time were rethought.In
classical conceptions, changing spaces [choosing
different locations for the laboratory] made no
difference to physical processes.The
same goes for time.In
this sense, absolute values for space and time
are indifferent to the operation of the law,
that is they have become irrelevant.This
led to the theory of relativity (183).
This
led to a more general approach to mathematics,
moving away from trial and error processes to
achieve solvability, a revolutionary step to
link problems and solutions in a new way.The
same idea can be seen in modern solutions to the
problem of specifying groups of differential
equations—a computer generates a population of
solutions so that the general pattern can be
discovered.Classical physics had to simplify because
it could not generate this kind of solution, and
so it tended to neglect models that lacked
exact solutions.This produces an additional bias ‘toward
a clockwork picture of reality’, and the
solvable equations happen to be the linear ones
(185).It
so happens that linear equations can be solved
economically, because of ‘the superposition
principle, which states that given two different
solutions of a linear equation, their sum is
also a valid solution’ (185).This
was a considerable bonus at the time.Linear
equations can apply to complex systems, as long
as the relevant variables operate at lower
levels of intensity.For
all these reasons, linear approaches tended to
dominate.
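[A minimal sketch, my construction, of the
superposition principle: the sum of two exact
solutions of the linear equation dx/dt = -x is
again a solution, while the sum of two exact
solutions of the nonlinear dx/dt = -x**3 is not.]

    import math

    t = 1.0

    # Linear case: x = C * exp(-t) solves dx/dt = -x for any C.
    x1, x2 = 1.0 * math.exp(-t), 3.0 * math.exp(-t)
    print((-x1) + (-x2) - (-(x1 + x2)))                 # 0.0: sum solves it

    # Nonlinear case: x = (2t + C)**-0.5 solves dx/dt = -x**3.
    y1, y2 = (2 * t + 1) ** -0.5, (2 * t + 4) ** -0.5
    print((-y1 ** 3) + (-y2 ** 3) - (-(y1 + y2) ** 3))  # not 0: fails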
A similar
approach was followed by Poincaré with
differential equations.The
problem was to model the interactions of three
solar system bodies, not just by looking at the
empirical singularities, but trying to generate
a
whole picture of their existence and
distribution, or to generate a space for all
solutions.He thus defined the problem itself in
spatial terms.
These
models clearly conform to Deleuze’s idea of the
virtual as well.Hiding problems behind immediate
solutions also hides the virtual and ‘promotes
the illusion that the actual world is all that
must be explained’ (187) [another example of
positivism really?Also a problem for activists?].The
hope was that a super law of everything would
emerge from this study.However,
with multiple attractors and non linear models,
there will always be emergence, surprise,
novelty, no end to problems.The
lack of surprise in classical physics depends on
effects already being present in causes, with
only quantitative novelties, a world without
productive history.
The
alternative ontology is fully historical.Each
individual is produced by ‘a definite historical
process of individuation’, and, through
interaction, each individual can itself drive
historical and causal processes.Even
the quasi causal is also historical, but with a
different form of temporality.So,
‘in the Deleuzian ontology there exist two
histories, one actual, one virtual, having
complex interactions with one another’ (188).There
is an historical process of actual events, and
one of ideal events ‘defining an objective realm
of virtual problems of which each actualised
individual is but a specific solution’ (188)
[Note that
there is also a very useful appendix explaining
the connection between terms used in this
reconstruction and Deleuze’s own terms, written
in the usual clear ways.This
clears up some mysteries at last—for me, for
example, supporting my hunch that the body
without organs was some sort of virtual
continuum, and that the desiring machines are
really individuating processes.There
is also a really clear discussion of the
wretched section on syntheses in Anti-Oedipus.Delanda!What a
hero!]