Notes on: Schrader, A. (2012) 'Haunted Measurements: Demonic Work and Time in Experimentation', differences: A Journal of Feminist Cultural Studies 23(3): 119-60.
Dave Harris
[Very long -- it took me 5 sessions over 5 days. There are deep assumptions about time buried in scientific culture. Gripping account of the controversies over Maxwell's Demon in thermodynamics and the various issues raised for measurement and observer effects, a clear prelude and parallel to Bohr -- and still going in the 1990s. Longer discussion of 'quantum erasure' than you find in Barad, which still structures the argument -- but more difficult to grasp]
The subjective experience of time is also
important for scientists. They are also well
aware of the tendency of human labour to
disappear in the final product — like a research
proposal. Feminist science studies has
questioned a number of binaries and dualisms,
but paid little attention to time. Rethinking
time is crucial, though, to changing
'oppositional hierarchies' [as she argued in the
Pfiesteria piece]. Scientific representations
omit human subjectivity and history, and so they
imply 'an ahistorical nature' (120). Latour has
apparently argued this, and how reducing nature
to a series of objects leaves science as a kind
of miracle, but again without human labour. So
objective notions of time are 'woven into the
very fabric of the theories'. This is supported by the notion of scientific progress. However, it is not entirely successful [hence the view that we have never been modern] and we are aware of 'natural-cultural hybrids' (121), so much so that the idea of progress must be changed — it now heads towards complexity rather than modernist simplicity.
The quantum mechanics notion of entanglement
raises the whole issue of observers and their
relation to the universe, but this issue has a
long history, for example in thermodynamics.
Stengers and Prigogine have argued that some
direction for time has to be presupposed for any
experiment, but what if the observer is no
longer external to this system — we enter the
problems of '"metaphysics of
representationalism"' for Barad, and '"
metaphysics of presence"' for Derrida. For him,
time is a series of moving presences, but only
the present itself seems real or actual, and
time can be understood in terms of becoming. It is sometimes thought of as irreversible [though not in classical mechanics, as we shall see]. It leaves
us with a paradox that the transition from
present to past is 'both ontologically real and
merely an appearance' (122). Sorting out the
contributions of objective event and subjective
intervention is the problem.
Physicists have long discussed the problems in
terms of the activity of Demons, often some
playful superhuman being who can see everything
that's happening. For Laplace, an all-seeing Demon would confirm Newtonian mechanics — a
'spectator theory of knowledge'. Classical
mechanics assumes that if time could be
reversed, the laws would still hold as we moved
back to initial conditions.
Quantum mechanics questions these assumptions,
initially through examining the influence of
observers, then through the influence of
measurement itself which provokes 'an
irreversible transition from… preparations… to
permanent marks' (123), or from potential to
actual.
Thermodynamics is 'in between', still affected
by both classical and quantum mechanics. It was
founded on a paradox — that heat '(a microscopic
notion related to the kinetic energies of
molecules)' can turn into work, a macroscopic
type of energy. In particular the Second Law
'explicitly prescribes an "arrow of time"'.
Maxwell's Demon was constructed to imagine what
would happen if humans could acquire
knowledge about the microscopic world without
leaving a distorting macroscopic trace. Schrader
wants to transform Demons into ghosts, which can
'reconfigure the very being of time'. There is a
"virtual space of spectrality" for Derrida, and
the notion of inheritance does away with a given
past and replaces it with a task [to honour
inheritance?]. However, 'we cannot just choose
our ghosts, that "to be haunted is to be tied to
historical and social effects" (Gordon 190)'
(124) [that's encouraging!].
The discussion of experiments, including thought
experiments in physics, shows the potential of
natural sciences to make a difference [in the
conception of time], as Kirby
saw with her 'Derridean provocation'. It is
central to feminist approaches. We need to
deconstruct the arrow of time and arrive instead
at a "ghostly" conception. If we can do this we
can radically challenge dominant terms and
concepts, not just modify them [a feminist
project really?]. Rejecting a concept of
anticipation based on irreversibility, also
leaves open 'spacetime for ethical concerns', in
the form of new material-discursive practice.
Maxwell's concept was extended by Szilard, who
connected the debate with information theory.
Feminists see this as critical in reading living
organisms as a matter of genetic code, and in
developing the possibility of artificial
lifeforms [both bad]. But we can reread it as an
example of how 'a different history is always
possible, at any time, here and now' (125).
Maxwell's Demon held open the possibility for
subjective inputs. The problem was how human
knowledge affected the efficiency of heat
engines. Maxwell wanted to deny that the Second
Law was universal. It assumes a clear
irreversible process where heat flows from
warmer to cooler systems. Earlier means hotter [the hotter state is the earlier one in time].
More technically, 'entropy always increases in
isolated systems'. The term entropy itself has
an interesting history, coined [for practical
reasons] in 1865, to grasp the '"transformation-content" of work'. The technical
problem was how to improve the efficiency of the
engines, and the finding was that any
transformation of temperature into work was
always accompanied by losses, some energy was
unavailable for useful work [produced by a flow
of heat] — entropy.
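[The textbook form of what 'entropy' and the Second Law actually say, added here for reference since the notes give only the history:]

```latex
% Clausius's definition of entropy and the Second Law (standard
% textbook statements, my addition, not Schrader's):
\[
  \mathrm{d}S \;=\; \frac{\delta Q_{\mathrm{rev}}}{T},
  \qquad
  \Delta S_{\mathrm{isolated}} \;\geq\; 0 .
\]
% The second relation is what 'prescribes' the arrow of time: in an
% isolated system entropy never decreases.
```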
Early formulations applied particularly to
cyclic processes, where, for example, heat was
absorbed from a reservoir [I thought of early
steam engines where the reservoir of cool water
used to condense the steam gradually gets hotter
itself, but this is another implication]. Heat
can never be completely transformed into work,
so the production of work implies a [potentially
irreversibly exhaustible] reservoir of
resources, although the cycle of work itself can
be reversed. A later physicist argued that this
indicated a cosmically valid law. Nature was
seen as a reservoir of energy, but one that
could be exhausted until "heat death". No
reservoir would eventually mean that no work
could be extracted. This was generalised to
become a universal 'tendency towards homogeneity
and death'. However, there was a problem — was
the heat lost in entropy just lost to human
beings, wasted, or actually annihilated? The
problem shows the clear links between notions of
time and 'what was considered useful' (127).
Maxwell introduced probability theory into
thermodynamics, increasing a possible role for
subjective experience. He also saw heat and work
as 'fundamentally distinct forms of energy'. The
development of the kinetic theory saw heat as a
matter of interaction of molecules, proportional
to kinetic energy, something 'radically
different from our tangible macroscopic world'.
Those processes [were still understood in
classical terms but?] could only be grasped
through probabilities, never directly observed,
so human knowledge became an important factor,
in effect underpinning the Second Law — we could
see the increasing entropy while claiming it to
be an objective property. However, this leaves
'a "degree of ignorance"'. Nevertheless,
[probabilistic] agreement between scientists
could produce a kind of objectivity, at least in
the sense that they did '"not depend on
anybody's personality"'. However, the
irreversibility of entropy was now dependent on
human knowledge, and it was still paradoxical —
the microscopic processes should still be fully
reversible, in the Newtonian sense, but in our
experience, heat was lost.
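[The kinetic-theory picture in formulas, standard results added by me for reference:]

```latex
% Molecular speeds in a gas at temperature T follow the
% Maxwell-Boltzmann distribution (my addition, not in the notes):
\[
  f(v) \;=\; 4\pi \left( \frac{m}{2\pi k_B T} \right)^{3/2}
             v^{2}\, e^{-m v^{2} / 2 k_B T},
\]
% and heat reduces to mean kinetic energy per molecule,
\[
  \langle E_{\mathrm{kin}} \rangle \;=\; \tfrac{3}{2}\, k_B T .
\]
% Temperature fixes only the distribution; individual molecular
% speeds are knowable only probabilistically -- the 'degree of
% ignorance' the notes mention.
```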
Maxwell began by thinking what might happen if
we could manipulate individual molecules as if
they were macroscopic objects. We could then
possibly stop entropy. Maxwell had to conceive
of a Demon to do this, given the limits of human
beings — in other words he assumed [in 1871]
unlimited knowledge to explain the paradox. The
Demons could open and shut valves [frictionless
ones], intervening in physical processes on the
basis of information. The idea would be to
determine the speed of the gas molecules and
then sort them [not average them] into two
compartments, allowing the faster ones to pass
through. This would produce order out of
disorder, and reverse the arrow of time. The
assumption was no work would be involved. The
Second Law would no longer be a law but a
probability statement. There are wider
implications though for the very boundary
between microscopic nature and macroscopic human
associated processes, including irreversible
ones.
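[A toy simulation helps me see the sorting step. This is my own sketch, not anything in Schrader, and it deliberately ignores the cost of the Demon's measurements -- which is exactly the loophole the later literature closes:]

```python
import random

# Toy Maxwell's Demon: two compartments of molecules start well mixed
# at the same 'temperature'. The Demon measures each molecule that
# approaches the trapdoor and lets fast ones into B and slow ones
# into A, apparently creating a temperature difference for free.
# (Measurement and memory are treated as free here -- the point at
# issue in the whole controversy.)

random.seed(0)
THRESHOLD = 1.0  # speed separating 'fast' from 'slow' molecules

# Each compartment is a list of molecular speeds (an exponential
# distribution stands in crudely for a thermal one).
a = [random.expovariate(1.0) for _ in range(5000)]
b = [random.expovariate(1.0) for _ in range(5000)]

def mean(xs):
    return sum(xs) / len(xs)

print("before: <v>_A = %.3f, <v>_B = %.3f" % (mean(a), mean(b)))

for _ in range(200000):
    # A molecule approaches the door from a random side; the Demon
    # measures its speed and opens the door only for the 'right' ones.
    if random.random() < 0.5 and a:
        i = random.randrange(len(a))
        if a[i] > THRESHOLD:          # fast molecule: let it into B
            b.append(a.pop(i))
    elif b:
        i = random.randrange(len(b))
        if b[i] <= THRESHOLD:         # slow molecule: let it into A
            a.append(b.pop(i))

print("after:  <v>_A = %.3f, <v>_B = %.3f" % (mean(a), mean(b)))
# B ends up 'hotter' than A -- order out of disorder, with no work
# done *in the model*, because measuring was treated as free.
```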
The Demon was investigated much further. Some
actually attempted to see the molecules using
light or magnetism. Problems emerged, such as
the apparently crucial role of the intelligence
of the Demon. A lot of literature was produced.
The Demon is now seen as ambiguous — horribly
human in its desire to overcome limits of
knowledge, but also material, a factor in
producing knowledge [by actually changing
things]. There is also a problem of Brownian
motion — the Demon would be subject to it and
would thus itself heat up and eventually be
unable to operate — a limit for all '"automatic devices" producing work'.
However if human beings could be '"continuously
and exactly informed of the existing state of
nature"' (131), they would not have to expend
work because there would be no need to actually
direct molecules but just use their senses —
although again this might involve a dissipation
of energy. The problem was postponed [in 1914]
on the grounds that we did not know how
cognitive processes actually worked.
Enter Szilard, who focused on what humans did
when they measured things, and whether
intervention could be done without generating
additional work. Thermal fluctuations might in principle be exploited to avoid loss of energy. However, the issue is whether this is
possible continuously. He proposed an apparatus,
a machine which could lift weight without using
a reservoir of heat, at least in the long run.
Human intervention would consist not of
manipulation of thermal fluctuations but rather
the ability to measure them. However,
measurements themselves produce entropy, especially if they necessitate memory [which will require work].
In more detail, Szilard proposed a 'one molecule
heat engine' (133). A gas made of one molecule
is confined in a cylinder surrounded by a
constant temperature [so an ideal condition?]
heat bath. If a partition is placed in the
cylinder which confines the molecule to one
side, we will get gas pressure on the partition
which might be used to do work. The Demon would
decide which half of the cylinder contains the
molecule and would then record the result. If we
replace the partition with a piston and use our
recorded results to couple it to a weight, we
will be able to get the piston to lift the
weight [because we will know which way the
piston will move as a result of gas pressure].
The molecule has transferred its heat energy to
the workload. As it loses energy it replaces it
from the surrounding bath. As the piston is
pushed all the way to one end, 'the gas once
again occupies the entire volume; its entropy
has not changed', although the heat bath has
lower [?] entropy. Nevertheless, we have
extracted work from heat alone [from the flow of
heat surely?]. The Demon will continue to measure
and to guide useful work so that we have 'a
perfect perpetual motion machine', which
continually extracts work from heat [I still
don't really get it — Schrader goes on to say
this happens without a temperature gradient, but
surely there is one between the bath and the
cylinder? And overall, the system has surely
lost heat?] [There is a diagram and further
explanation on page 134]. There is still a
problem because all the Demon does is to
measure, leaving who actually inserts the piston
as unclear — his measurements are crucial in
order to correctly connect the weight and the
piston, and the engine itself provides 'the
memory of the binary decision process'. So the
human experimenter seems to have been
eliminated.
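[The standard bookkeeping for the engine, which Schrader does not spell out, added by me for reference:]

```latex
% Treating the single molecule as an ideal gas with pV = k_B T,
% the isothermal expansion from half the cylinder (V/2) to the
% full volume (V) extracts
\[
  W \;=\; \int_{V/2}^{V} \frac{k_B T}{V'}\, \mathrm{d}V'
    \;=\; k_B T \ln 2 .
\]
% The heat bath supplies this energy as heat, losing entropy
% \Delta S = -k_B \ln 2 per cycle -- the apparent Second Law
% violation that the Demon's measurement and memory must repay.
```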
However, Szilard agrees that the steps involved
'belong to one and the same act of measurement',
including the human bits where the cylinder is separated and the partition is inserted.
Translations have rendered this as a sequence
rather than the same act of measurement
happening at the same time. For Szilard,
however, any measurement shows a '"memory
faculty"' — measurement involves 'coupling [of
values provided by instruments with positions of
objects] accompanied by memory'. In this case,
the movement of the piston embodies the memory
of the location of the molecule, and Szilard
assumes that the apparatus is then decoupled
[and the memory lost?].
There is still a problem deciding when
measurement is actually completed, and
decoupling takes place, so that the values measured enter memory [?]. If we are using
movement of the piston to measure the location
of the molecule, this means we can do the
measurement only when the piston starts to move
[in her philosophical way, this means 'there is
no time before the piston moves' (136)]. It is
no longer clear when the piston is actually
coupled and so it is equally unclear what the
period of measurement is [I think she's saying].
'The very notion of memory assumes the past of
the coupling, but that which was coupled only
exists [only reveals a complete effect?] upon
decoupling' (137).
For Szilard, we complete the measurement when we
can't draw any conclusions from the initial
values we observe [maybe — two parameters are
decoupled is how it is put], and where there is
only a memory. But this memory can both precede
and follow a coupling if the engine is run
continuously, so humans [who store the memory]
must be intervening constantly. Further, Szilard
does not see any difference between the first
coupling and subsequent ones [but there probably
is one, because the subsequent ones are based on
memory?]
In general, humans do contribute to measurement,
by setting up the apparatus — in this case
inserting the partition. This is not a
preparation for a measurement, but an
intervention that actually establishes what is
to be measured. The Demon is situated within
[part of] the system. There is a flow of time
involved in design, with no precise notion of
how it all started, or when exactly the present
observation becomes a memory of the past.
A further development had an ironic result. If the
further development had an ironic result. If the
Demon can see the molecules, this is
'information gathering with a light signal' and
must lead to entropy increase. Generalising, any information acquisition must do so. It fits with
Szilard by identifying his memory with
information. One implication was that
information could be '"divorced from human
intelligence"'. Later still, another information
theorist suggested that there might be
computational processes that would not produce
entropy. Szilard got it wrong because there was
a true entropy cost not so much in measurement,
but in the erasure of memory. 'Finite
information-processing Demons had to regularly
clear their memory registers, it was argued, and
that takes work' (138). Controversy still
persisted as to whether entropy is produced
during information acquisition or during this
process of memory erasure necessary in any
cyclic process. We need to remember that
measurement was always accompanied by memory for
Szilard. 'In plain language', taking a piston
and inserting it into a cylinder does not take
work, but removing it and then reinserting it
does — the latter has a history [which has to be
taken into account? Overcome?].
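[The 'erasure takes work' point is what is now called Landauer's principle -- my label and formulas, not Schrader's:]

```latex
% Landauer's principle (standard statement, added for reference):
% erasing one bit of memory at temperature T dissipates at least
\[
  W_{\min} \;=\; k_B T \ln 2 ,
  \qquad
  \Delta S_{\min} \;=\; k_B \ln 2 ,
\]
% which exactly cancels the k_B T ln 2 of work the one-molecule
% engine extracts per cycle -- the Second Law survives once the
% Demon's memory register is counted as part of the system.
```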
As information theory developed, Szilard's
measurement became the detection of the value,
and the preparation of an experiment was no
longer considered to be measurement. Information
retrieval and storage were also seen as
independent and separate. This was appropriate
for information theory, but the meaning of
Szilard's experiment was changed and the central
problem displaced into trying to establish
exactly which bit of the measurement process is
irreversible. Incidentally, Szilard
'quantitatively specified the minimum amount of
entropy the demonic measurement work would
produce, which became the measure now known as
"bits" in computer science' (139). [Apparently,
this meant that] where entropy is actually
produced in accordance with the Second Law
'remains ambiguous'.
The earlier accounts assumed that
irreversibility arose only once interactions
left a trace in memory. This became a problem of
drawing a boundary between human interventions
and the measurement device that was already in
the system. In effect, the Demon is not part of
the thermodynamic system, nor properly external
to it. Measurement can still record or create,
or do both. We are left with undecidability —
what is in the system and what is outside, what
counts as a reading and writing of memory, and
what a memory trace actually is. Developing this
as an information theory problem diverts from
the problem of human intervention. The
conventional arrow of time works still if
coupling happens before measurement and memory
only after it. The externality of an observer is
also necessary. The reversible system cannot be
closed to the macroscopic world, because
external influences have to penetrate it
including 'an additional measurement on the
Demon's memory' (140). As soon as the Demon
relates to the external observer, we find all
the usual problems about subjectivity and
objectivity, matter and mind and so on. The
irreversibility of the Second law 'becomes a
consequence of the fact that the Observer does
not measure but merely observes'. [Very puzzling
stuff].
We can now come to a similar measurement problem
in quantum mechanics. Indeed the early quantum
theorist von Neumann began with Szilard's Demon,
hoping to develop a quantum version of entropy
where the quantum state will also be
irreversible in a measurement. Early quantum
mechanics assumed irreversibility, conventional
relations between past and future, preparation
and test — entanglement between them produces
the actual measurement.
However mathematical formalism developed which
predicted probabilities for measurements but did
not 'account for the measurement itself or its
specific outcome' (141). For Schrader,
'measurement implies a discontinuous and
irreversible "selection" of a specific value'.
When applied to quantum processes, we find a
reversible bit before the measurement, the
irreversible measurement and therefore a strange
boundary between them. The transition from
reversible preparation providing a set of
possibilities to irreversible marks on bodies is
'the central quantum mystery'. It is sometimes
called the collapse of the wave function upon
measurement [the quantum state collapses into
the familiar macro one], where probabilities
become a definite value [waves produce
measurable bands? Particles go through various
stages and actually produce marks on recording
film?].
There have been a 'large variety' of attempts to
explain collapse. In general, two options exist
— one sees measurement as a physical disturbance
that interrupts correlation between quantum
systems; the other sees human intervention as
important, a matter of selection, sometimes
associated with consciousness '(and by
extension, to God as guarantor of determinism
enabling the apparent human "choice")'.
The wave particle duality is an example
[summarised 141-2]. Matter also has a wavelike
character seen by experiments with electrons
rather than photons. As Barad
has noted, we get an interference pattern even
if electrons are sent through a double slit one
at a time, implying that electrons remember the
path of earlier ones. Attempts to see which slit is involved bring the disappearance of the interference pattern. Adding a which-path
detector produces particle behaviour. All this
can be explained in terms of Bohr and
complementarity — if we measure the position of
electrons one way we define them as
particles, and there can be no interference. If
there is no which-path detector, the apparatus
'defines the electrons as waves' with
interference. We are not talking about intrinsic
wave or particle properties [so Barad has to go beyond Bohr here]. 'Rather, a measurement
defines these properties that simply do not
exist independently of the measurement
apparatus' (142).
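[To keep the logic straight I find a toy calculation helps; this is my sketch, not Barad's or Schrader's. The whole which-path story lives in the cross term:]

```python
import math

# Toy double-slit intensities (illustration only): with no which-path
# detector the two amplitudes add coherently and interfere; with
# which-path information the *probabilities* add and the fringes
# vanish. Units are arbitrary throughout.

WAVELENGTH = 1.0
SLIT_SEPARATION = 5.0
SCREEN_DISTANCE = 100.0

def intensities(x):
    # Path-length difference to screen point x gives a relative phase.
    delta = 2 * math.pi * SLIT_SEPARATION * x / (WAVELENGTH * SCREEN_DISTANCE)
    coherent = abs(1 + complex(math.cos(delta), math.sin(delta))) ** 2
    which_path = 1 + 1  # |a1|^2 + |a2|^2: no cross term, no fringes
    return coherent, which_path

for i in range(-10, 11):
    x = i * 0.5
    c, w = intensities(x)
    print("x=%5.1f  no detector: %4.2f   which-path: %d" % (x, c, w))
# The 'no detector' column oscillates between 0 and 4 (fringes); the
# 'which-path' column is flat -- the cross term that carried the
# interference is gone, redistributed into correlations with the
# detector.
```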
This in turn relied upon Heisenberg's
uncertainty relation — the position and momentum
of a particle cannot be determined
['arbitrarily exactly'] at the same time. A position measurement, with a photon for instance, 'necessarily involves a momentum transfer that would smear out the interference pattern', so
measurements disturbed the path of the particle
physically. This means it is impossible to know
how things really are.
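[The relation itself, in its standard form -- my addition, since the notes give it only qualitatively:]

```latex
% Heisenberg's uncertainty relation:
\[
  \Delta x \, \Delta p \;\geq\; \frac{\hbar}{2} .
\]
% On the 'disturbance' reading, a photon sharp enough to resolve
% which slit a particle passed must carry momentum of order h/d
% (d = slit separation), enough to smear out the fringes.
```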
Later experiments tried to rule out this physical disturbance [I think this is the ingenious erasure experiment described best by Barad].
They used atoms instead of electrons and
developed a which-path detector that would not
influence the movement of atoms. A laser beam
excites the electron within the atoms and this
means a photon is emitted when travelling
through one of the two cavities [placed in front
of the slits]. This helps us trace the path of
the atom without disturbing its movement. The
interference pattern still disappeared, implying
that it is the measuring cavity itself which is
responsible for the disappearance.
For Barad it shows the very 'experimental possibility of distinguishing between the paths of the atoms' (144) [lots of diagrams on the intervening pages] which destroys the
interference pattern, and removing the
distinguishing apparatus produces another
interference pattern. This is the quantum
erasure effect, where it looks like we can erase
the which-path information. Further work
involved adding another photodetector between
the two cavities, so that the apparatus 'cannot
discern from which cavity the detected photon
originates' [and apparently, the detector has an
equal 50% chance of receiving a photon from each
path]. The hits on the detector are correlated
with the marks on the screen to produce 'another
interference pattern'. What we have seen here is
the effects of 'measurement upon measurement
upon measurement, in which the measurement
apparatus becomes the object of measurement of
another', with each measurement producing a new
correlation between atoms and photons. For the
physicists concerned [Scully, Englert and
Walther], this explains the disappearance and
reappearance of interference patterns, but
exactly how these correlations are produced
remains a question, especially whether
subsequent measurements 'somehow undo the
previous one', or whether it is a new
interference pattern. The first explanation
suggests that measurements are reversible, and
the second that correlations are irreversibly
extended [and added]. This still preserves the
conventional arrow of time, and it is that that
we will have to critique.
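[Again a toy sketch of my own to fix the idea of 'measurement upon measurement': the unconditioned screen pattern shows no fringes, but sorting the same hits by the eraser detector's outcome recovers fringes and anti-fringes that cancel in the sum.]

```python
import math

# Toy eraser bookkeeping (my illustration): each atom is correlated
# with a which-path photon. Unconditioned, the screen shows no
# fringes; conditioning on the eraser detector's two outcomes sorts
# the same hits into complementary fringe / anti-fringe patterns.

def screen_distributions(x):
    delta = 2 * math.pi * x          # relative phase at screen point x
    total = 2.0                      # |a1|^2 + |a2|^2: fringe-free
    fringe = 1 + math.cos(delta)     # hits correlated with outcome '+'
    antifringe = 1 - math.cos(delta) # hits correlated with outcome '-'
    return total, fringe, antifringe

for i in range(9):
    x = i / 8.0
    t, f, a = screen_distributions(x)
    print("x=%.3f  total=%.2f  '+'-correlated=%.2f  '-'-correlated=%.2f"
          % (x, t, f, a))
# fringe + antifringe == total at every x: nothing on the screen
# itself changes; what 'erasure' adds is a new correlation between
# atom marks and photon detections -- measurement upon measurement.
```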
Some physicists [Greenberger and YaSin]
have introduced the notion of '"haunted
measurement"' (145). If correlations just add up
irreversibly, endless regress threatens, '"how
do we know that no [measurement] will come along
at some future time"' to rearrange existing
findings? [This must be a problem for Barad's
account too?]. Those advocates insist that there
must be genuine destruction of measurements —
'correlations must be physically destroyed'
(146), and that measurements that leave a
macroscopic trace must retain 'a "latent order"'
so that the measurements can actually be undone
in the future. Their quantum erasure experiment,
using similar kit, shows that subsequent
measures can indeed make a measurement disappear
— so it is a 'macroscopic "ghost"' that 'disappears if one does not look at it'.
Measurement is therefore haunted. 'Looking at'
here does not mean an observer, but 'physical
interaction that detects the evidence of the
passage directly and destroys the coherence
[pattern produced before the which-path
detectors] for good'. It is again down to
photons used in observation hitting objects and therefore changing them, making the measurement irreversible.
So we can have both 'true' measurements, which
are irreversible, and haunted ones which are
reversible. How can we distinguish between them?
Until we do, the very idea of quantum
measurement must remain '"subjective"' and also historically contextual, awaiting more sophisticated measurement in the future.
Heisenberg's uncertainty principle is also
upheld.
But Scully et al could extend the notion of
haunted measurements to include an effect by
experimenters here and now, for example in
manipulating the which-path detectors, even
after the atoms pass through them, meaning
that there is a choice between which-way
information and quantum erasure '"at any time"'
(147), a '"delayed choice" mode of the quantum
erasure effect'. Schrader finds this misleading
and prefers to replace 'choice' by 'material
"definition"'.
Measurements do seem to be affected ['haunted']
by human choice. For Scully et al human
measurements do not destroy quantum coherence,
but create a series of correlations that produce
the interference patterns, and the experimenter
can choose which one to use. The actual path of
the atom is not affected: it is future
possibilities that are the basis of choice. Even
so, the choice must remain potential [apparently
an actual measurement implies that the wave
function permanently or irreversibly physically
collapses upon measurement, which they reject].
In more familiar terms, 'it only appears to us
as if the atom took a particular path'. They go
on to say that all physics is about
'"as–if–realities"', and get quite
phenomenological about what physics deals with.
The measurement effects are 'mere "mental
processes"'. Nevertheless, the correlations have
some objective reality, because they can be
demonstrated experimentally — but again this is
only the appearance of objective reality. This
helps solve the problem of the order of time,
because an objective past cannot possibly be
influenced by future possibilities, although
this can appear to be the case within
'immaterial' 'human knowledge production' (148).
Schrader points out that this involves 'the same
insurmountable contradictions' as we saw earlier
with Szilard — first we eliminate human
activities, and then we reintroduce them as a
determining factor [deep inconsistency at the
heart of experimental practice].
We can make progress with this by rethinking
time. The arguments so far remain with
conventional definitions, [even with Bohr, she says — the past might be questioned by arguing that individual objects do not exist before measurement, but the future remains conventional] -- the potential future remains a matter of human choice. Just as Bohr challenged the idea
that the world consists of discrete objects, so
we must challenge the assumption, with Derrida,
that time is made up of successive linking of
presents [but this is a daft conception in the
first place]. The older conception implies a
human subject autonomously deciding to choose an
experimental setup before investigation, but, as
Kirby argues, the human is not separate from the
world it experiments on, not autonomous from
Derridean writing. [For me, these two are not
the same. If people are not autonomous with
respect to the operation of writing, that means
not that they are therefore a part of nature,
but rather a part of a linguistic system — the
autonomous individual is replaced by the
individual in language]. We must account for
'the materiality of human contributions to
measurement', via Barad and agential realism.
We can combine agential realism with Szilard's
notion of measurement as a coupling process with
memory. This will help us solve some problems
such as material traces existing before the
effects, as in quantum erasure, and the
'mysterious movements of a piston that nobody
caused'. First we have to agree with Szilard
that the coupling of measurement apparatus and
object is not a preparation, but actually part
of a measurement — this resolves the issue of
reversible preparation and an irreversible
recording process. We have also to replace the
idea of individual human choice and the point
about future measurements undoing the results.
We need a new Demon of our own, 'to do some real
work'.
Agential realism replaces the awkward position of human observers as either external to knowledge production or its sole agents. These
'are just different sides of the same coin'
(149). For Barad, there is no external observer
nor an outside boundary of the experimental
apparatus [which raises its own problems,
because the apparatus now expands to include the
whole world?]. The subject is one agent of
observation emerging through specific cuts, as
is the boundary between subject and object.
Measurements do not discover what has already been there, nor do they create something entirely
new — 'they rather materially re(con)figure the
world', as with Haraway again.
If Szilard is right to say that the preparation
of an experiment includes measuring the value,
'there is no temporal or spatial distinction
between the taking place of a measurement and
the detection of the value' [but wasn't this
distinction between preparation and experiment
criticised just above?]. This means that 'the
production of human knowledge cannot be regarded
as ontologically distinct from the setting up of
an experiment'.
If we go back to the one molecule engine and see
the piston as a measurement device, embodying
the memory of an encounter with the molecule,
the piston is not detecting the pre-existing
location of the molecule, nor can it create it
retrospectively after being decoupled. Instead
it actually 'contributes to the definition of
the past location of the molecule' [maybe -- the 'left' location means 'once the piston has moved'?]. The intra-action is between inserting
a movable piston, and then tracking its movement
with the molecule [which helps us] 'give meaning
to the statement that the molecule was located
[in a particular place]'. The molecule did not
have a location before the piston was inserted
because 'there were not two sides yet' [it still
had a location, of course, although we were not
in a position to describe it in terms of
sides?]. Only when the piston is moved to the
right can we say that the molecule was on the
left. And we can't distinguish between left and
right before the piston moves [again highly
debatable]. Therefore being on the left side is
not 'an intrinsic feature of the molecule that
has to be somehow detected' (150) [but its
location within the cylinder is an intrinsic
feature of the molecule?]. Saying that it is on
the left side is meaningless, 'neither true nor
false, and, most important, not potentially
true' until the piston moves to the right. 'It
is not a future possibility that awaits human
determination or decision' [except humans have
constructed the possibility for the piston to
move]. The molecule has to leave a material
trace first, so that memory gets embodied in the
piston, but once this happens, 'the molecule is
no longer confined to the left side of the
cylinder' [once the movement of the piston is
completed, that is, and if we have some absolute
notion of left side rather than one relative to
the piston]. As a result [really stretched
conclusion], the existence of memory presupposes
'the decoupling of the molecule's current
position from its "past"', a kind of 'delayed defining event' (150).
If the molecule is actually to push the piston, we have to affirm that it '"really" was on one side of the partition', and this is not just a retrospective appearance. But what does this
'really happened' mean? The point is that
measurement 'is a material-discursive practice to which not only humans contribute' [the piston and molecule are contributing?]. Definitions
are not just human. What we see is 'an
intra-activity to which the entire experimental
apparatus contributes' — pistons, cylinders,
molecules, and even 'the history of
thermodynamics' [trivially true?]. Definition
itself is a 'material-discursive practice producing a record' [so definition has become
measurement here?]. The whole material
arrangement which includes the experimenter is
what enacts the agential cut so that 'the
correlation between measurement apparatus and
object yields a definite value'. There is no
absolute separation between any of the
components. Together, they engender a trace. It
is not therefore strictly speaking a decoupling
[maybe], more a specification of a correlation,
a boundary making practice. These have no finality for Barad but are 'nevertheless
("retroactively") causal' [a whole string of
musts here, I suspect to preserve coherence].
Measurement practices should be seen as both
causal and open-ended at the same time.
Retroactive causality does not mean that the
piston somehow causes the location of the
molecule — we can never affirm that the molecule
just is somewhere before the piston moves. The
agential cut that decouples 'defines the past as
memory that has never been present [before the
cut?]' (151) [looking suspiciously definitional
here]. It is like the Derrida trace, which, he
says can never just be reappropriated and turned
into a simple present. It 'must be thought as an
"originary repetition"' [God knows what that
means — that repetition is inherent somehow? If
we don't exactly know, this is only argument by
authority, of course]. We can still affirm that
a molecule really was on one side, but we should
not confuse causality with this Derridean link
between consecutively present moments, time as
an externally directed flow where only the present is real [defines no known version of human consciousness or scientific concept?].
We might need to change metaphors [sic] and talk
about writing and reading instead of coupling
and decoupling. Reading is not something that
only human subjects do. Larger experimental
apparatuses can act so as to produce a cut
between object and measurement agency. This
'reading defines the writing as that which
precedes it' [so we have deconstructed the
apparent self-sufficiency of writing in our
reading? Infinite regress here too if that
reading also becomes a writing for subsequent
readings etc]. Reading localises writing, seeing
it as 'a material trace associated with a
particular apparatus'. Reading does not cause
writing retrospectively. 'There is no memory before it is read'. Causality is also an effect
of a reading [we discover it, or construct it by
reading?] [Somehow, this is not just a
linguistic operation, but] 'an expression of the
materiality of traces' and we must account for
them. 'Causality itself is not caused' [an old
paradox here]. We define it in a delayed way,
after reading has enacted a boundary 'and a
temporal order'. It is inevitably 'internally
haunted' because the past we specify is also
defined 'in reference to an open-ended future'
[because we recognise the agential nature of
reading and its local effects?].
To make things worse, readings do not just
precede or follow writings, and every reading is
a rewriting [writing goes all the way down — at
least until it stops with Derrida's philosophy].
This does not erase memories of past intra-actions but produces it [the intra-action] as
something past, 'locating and dislocating it at
the same time'. A new record does not just add
to earlier ones because earlier ones are no
longer 'equally' [weasel] accessible. The world
does not become increasingly tangled but rather
'differently entangled' (152), and different
entanglements have different specifics.
Reading and writing requires work, it is
'material practice'. There is no external drive
to greater complexity [so what explains
scientific choice or intellectual development?].
Whenever we measure something we are specifying
the writing that determines the meaning of the
correlation and how it becomes (de)coupled [with
the usual caution, 'there is no coupling before
a decoupling']. Everything will depend on how we
think of an event, as a moment in time, or as
something 'always already spectral'. Latour is
right to say that the past can be changed in the
here and now, but Schrader wants to go further
and say 'the [conventionally objective and
independent] past has never been present nor
will it ever be present in terms of a future
present'. All the terms have to be
produced by work, and '"all work produces
spectrality"' say Derrida and Steigler [because
this is still linguistic work though?].
Scully et al argue that all measurements are
potentially given by an experimental setup and
that the human experimenter chooses the relevant
one. This is a matter of choice, and so
measurements are reversible. Greenberger and
YaSin assume that the outcome of a measurement
is still determined by physical disturbance,
giving a deterministic notion of reading — but
that becomes a problem when new knowledge
retrospectively changes the potentials. Both
groups see the 'immateriality of human
knowledge'. Both assume 'the givenness of the past, that is, its presence' [is she
deliberately trying to confuse us with homonyms
and puns?]. We must abandon all these
assumptions. Measurements are haunted, so, to
use the old language, 'Demons contribute to
work'. There is no potential future that follows
automatically from the past. Instead,
measurements are 'internally haunted such that
work creates time' [agential cuts create
boundaries and temporal sequences?].
Measurements start with memories, material
traces, and they offer 'material
re-writings/readings'. Objects and measurement
devices 'do not exist independently before a
measurement', nor are they objectively
correlated 'before the action begins'. All
interactions leave traces. 'But the trace by
itself is not' [has no independent existence].
The same goes for memories which have to be read
in order to exist. This all requires work, which
in turn means 'material determinations'. It is
only when agential cuts localise that we have a
notion of 'proper space and time' for memories.
We can reconfigure the past, 'but that does not render it less real' [real to us, that is] [but
is she saying that once we have made an agential
cut, the event we have created is real, even
when it passes into the past, and subject to
rereading?]. Any rereading 'relies on the
materiality of traces' [only in empirical science? You could extend the notion of 'material', of course].
The old arrow of time idea assumes a closed
system, disturbed only by something external,
producing reversible effects. This notion even
extends to quantum measurements, although here,
the assumption is that these are irreversible.
In both cases, a bounded system is assumed [she
says there is no substantive difference between
relatively open and relatively closed systems].
A closed system is often assumed in defining
what is 'natural' — something that operates by
itself, often heading towards equilibrium. But
'there are no systems by themselves' (153).
There can be no automatic entropy either [lots
of implications here for the heat death of the
universe?] — 'entropy is a feature not of a
system but of an experiment' [quoting somebody
called Jaynes] and experiments themselves are
'open-ended material discursive practices'.
We still need some notion of the Demon even
though we can actually see the molecules now.
However, the molecules do not occupy a
preestablished system. There are indeed limits
to the ability of observers, who can never move outside a finite part of the universe. Nor can we
'simultaneously be a part of nature and have an
external view of nature's activity', because we
actually enact system boundaries that we become
part of [so a further twist to Kirby's assumption
that there is nothing outside nature?].
Everything depends on how boundaries are
constructed.
The idea of a working Demon will 'incorporate
the spectre of the "past" and memories of the
future'. Measurement is work, intra-activity,
omnipresent, and irreversible in the sense that
it is irreducible. It doesn't just accumulate in time; 'rather, it constitutes time'. Processes
are irreversible because material traces and
boundaries are irreducible [presumably meaning
not outside work? We are close to assuming an
arrow of time with work here though? When we
revisit the past, we don't go back to a less
complex reading?]. Transformative work is just
productive work, intra-activity. It is not just
something that is opposed by wasteful heat
[which was originally defined as transformation work and therefore as something that doesn't really count as work].
Haunted measurements show that the work of
scientists is not just a matter of operating on
components that are already there — it is always
transformative not just additive. It follows
that the history of science is no longer purely
productive. Those judgements also depend on what
we mean by time [because there is always an
assumption of productive work taking place
without wasting time?]. Nevertheless, every
intra-action produces 'possibilities of changing
its "direction"' [really idealistic about the
potential of science here? And the direction of
science is always towards complexity? Ignores the social constraints of maintaining research programmes etc -- acknowledged in the Pfiesteria piece]. So we don't invalidate the Second Law,
and its meaning remains contested, despite all
the effort to exorcise Demons.
[As usual, the notes are quite gripping, setting out the relation with Latour, for example, or describing the various attempts to pin down
Demons — note 6 tells us that the Demon is
nearly always masculine so his gender is
relevant in understanding his activities! Some
of the puzzling remarks about memory erasure
requiring work are explained in note 15 — memory
erasure is always logically irreversible. Note
19 says that she herself has read Bohr in a
particular way, influenced by Scully et al —
they have added the bit about the potential
future existence of objects.]