unsubscribe
Arthur Jackson (ajackson@SSA.CO.SANTA-CLARA.CA.US)
Thu, 19 Nov 1998 07:53:25 -0800
Don Mikulecky wrote:
>
> Don Mikulecky replies:
>
> Bruce Edmonds wrote:
>
> > Thanks for your reply Don, this is much clearer than other stuff I have
> > read.
> >
> > > Don Mikulecky replies:
> > > If I can summarize Rosen's "Fundamentals of Measurement" briefly ...
> > > ... Reductionist methods seem to
> > > eliminate the problem by reducing systems to simple physical models which
> > > seem totally compatible with measurement. In a nutshell, what is lost here
> > > is the identification of functional entities which have their identity
> > > defined by the context provided by the system. Reducing the system for
> > > measurement eliminates that context and thus makes the system seem simple.
> >
> > Right, if I may paraphrase: the data gained by measurement is itself a
> > model of the phenomena under study (the 'data model'), and the measuring
> > apparatus itself forms the encoding relation - an apparatus we have
> > designed. The modelling language is (typically) numerical (or another
> > formal language). Such maths is necessarily context-free (that's why we
> > use it - so we can do unlimited inference in it; this is both its limitation
> > and its power). It is this loss of context which causes the loss of
> > meaning. Thus the formal modelling we do is already in the realm of the
> > artificial (starting from the data model), but we actually do a lot of
> > implicit modelling with our measuring apparatus.
>
> It goes even further than that. Implicit "manipulation" of the data is a
> natural
> function of perception. We are dealing with the easy parts. You might say
> that the choice of model format has the possibility of both conscious and
> unconscious contributions.
>
> >
> >
> > There is a slight wrinkle here, for it seems to me that the data is
> > reinterpreted in two indirect ways: (1) via a (meaning-laden and
> > context-dependent) analogical model and (2) via the predictions of
> > future measurements via a phenomenological model of the data (which is
> > usually in some formal inferential system).
> >
> > The analogical models are important because they give meaning to the
> > whole enterprise and help us direct our scientific actions (disparagingly
> > called 'heuristics'). This is part of what is missing from the
> > formalism.
>
> yes....among other things
>
> >
> >
> > > ... Clearly, one does not just "measure".
> > > Measurements are abstractions DESIGNED to fit certain formal descriptions in
> > > the modeling relation. Thus what we observe is NECESSARILY dictated by our
> > > choice of formal system. More subtle is the fact that the encoding and
> > > decoding are independent mental activities which are in no way formulated by
> > > the formalism.
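
[A minimal sketch in Python, to make the encoding/decoding structure above
concrete. It is purely illustrative: the toy "natural system" and the function
names are assumptions for illustration, not anything from Rosen. The only point
is structural - encoding and decoding sit outside the formal system, and
inference happens only inside it.]

    # Toy modelling relation (illustrative assumptions throughout).
    def natural_system(state):
        """Causal entailment in the natural system (a stand-in process)."""
        return state * 1.02

    def encode(observable):
        """Measurement: abstracts the phenomenon into a formal datum."""
        return round(observable, 3)   # the apparatus decides what is kept

    def formal_model(datum):
        """Inferential entailment inside the chosen formal system."""
        return datum * 1.02

    def decode(result):
        """Interpreting a formal result back as a predicted observation."""
        return result

    # The relation "commutes" when causation-then-measurement agrees with
    # measurement-then-inference; checking this is itself another measurement.
    s = 10.0
    via_nature = encode(natural_system(s))
    via_model = decode(formal_model(encode(s)))
    print(via_nature, via_model, abs(via_nature - via_model) < 1e-3)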
> >
> > [I think 'dictated' is a bit over-strong here - I do not think we can
> > *totally* dictate what we observe, 'inevitably constrained and shaped
> > by' might be more accurate.
>
> yes..it's really more interactive...the formalism is also "dictated" by what
> we can measure
>
> > In my terms the phenomena are
> > approximated and fitted to an entity (a datum) in a modelling language (via
> > our measuring), so both phenomena and modelling language (and the
> > modelling relations implicit in the measurement process) together dictate
> > the data. The way these all interact is complicated, no one part
> > entirely dominates.]
> >
> > I think there are some parts of the philosophy of science which would
> > interest you (you have probably come across the second already):
> >
> > Bogen, J and Woodward, J (1988). Saving the Phenomena. The Philosophical
> > Review, 97:303-352. - Makes a clear distinction between data and
> > phenomena: the data are used in prediction, the phenomena in
> > explanation.
> >
> > Cartwright, N (1983). How the Laws of Physics Lie. Oxford: Oxford
> > University Press. - Shows how, in practice, physicists use a variety of
> > types of models including: phenomenological laws (which are true of the
> > data but don't explain much) and theoretical laws (which are used in
> > explanation but don't predict).
>
> Thanks...will get them
>
> >
> >
> > Anyway, to conclude, nobody thinks they *themselves* have a problem:
> > most scientists (typically positivist & realist) don't think about it
> > much and just do the stuff, philosophically minded reductionists will
> > discount the meaning-laden stuff as mere heuristics which will eventually
> > be replaced by 'proper' science, and the holists don't see the fact that
> > we only have models of reality as a problem - this is how things are.
> >
> > The 'measurement problem' is something the holists think the
> > reductionists *should* have (but the reductionists don't actually
> > care). Is that it?
>
> Positivism and reductionism have worked out a beautiful scheme for themselves.
> If the method won't work, it is not science (we call this "science of method").
> The alternative seems clear to us....a science of CONTENT. Note that this
> profoundly changes the nature of any concept of objectivity. In the science of
> method....the method is what defines and assures "objectivity". Anything else
> is thereby "subjective"....and by inference, of lesser value.
>
> In the science of content, the modeling relation dominates in the quest for
> "truth" (a workable picture of nature)and the notion of objectivity becomes
> irrelevant since what we seek is method independent.
>
> enough radicalism for now.
>
> >
> >
> > Regards.
> >
> > PS. Which is the first/most basic Rosen ref. that deals with the
> > modelling relation?
>
> chapters 2-5 in "Anticipatory Systems"
> > --------------------------------------------------
> > Bruce Edmonds,
> > Centre for Policy Modelling,
> > Manchester Metropolitan University, Aytoun Bldg.,
> > Aytoun St., Manchester, M1 3GH. UK.
> > Tel: +44 161 247 6479 Fax: +44 161 247 6802
> > http://bruce.edmonds.name
>
> best wishes,
> Don