Logic: Deductive and Inductive, by Carveth Read

Part 28

Still, some chance coincidences do recur according to laws of their own: I say _some_, but it may be all. If the world is finite, the possible combinations of its elements are exhaustible; and, in time, whatever conditions of the world have concurred will concur again, and in the same relation to former conditions. This writing, that cab, those chimes, those scales will coincide again; the Argonautic expedition, and the Trojan war, and all our other troubles will be renewed. But let us consider some more manageable instance, such as the throwing of dice.

Every one who has played much with dice knows that double sixes are sometimes thrown, and sometimes double aces. Such coincidences do not happen once and only once; they occur again and again, and a great number of trials will show that, though their recurrence has not the regularity of cause and effect, it yet has a law of its own, namely--a tendency to average regularity. In 10,000 throws there will be some number of double sixes; and the greater the number of throws the more closely will the average recurrence of double sixes, or double aces, approximate to one in thirty-six. Such a law of average recurrence is the basis of Probability. Chance being the fact of coincidence without assignable cause, Probability is expectation based on the average frequency of its happening.
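In modern terms, the tendency to average regularity can be exhibited by a simple simulation (a hypothetical sketch; the function name and seed are our own, not the text's):

```python
import random

def double_six_frequency(trials, seed=0):
    """Throw two dice `trials` times; return the observed frequency
    of double sixes, which should approach 1/36 as trials grow."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        d1, d2 = rng.randint(1, 6), rng.randint(1, 6)
        if d1 == d2 == 6:
            hits += 1
    return hits / trials
```

With a large number of throws the observed frequency settles near 1/36, though no single run is exactly regular.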

-- 2. Probability is an ambiguous term. Usually, when we say that an event is 'probable,' we mean that it is more likely than not to happen.

But, scientifically, an event is probable if our expectation of its occurrence is less than certainty, as long as the event is not impossible. Probability, thus conceived, is represented by a fraction.

Taking 1 to stand for certainty, and 0 for impossibility, probability may be 999/1000, or 1/1000, or (generally) 1/_m_. The denominator represents the number of times that an event happens, and the numerator the number of times that it coincides with another event. In throwing a die, the probability of ace turning up is expressed by putting the number of throws for the denominator and the number of times that ace is thrown for the numerator; and we may assume that the more trials we make the nearer will the resulting fraction approximate to 1/6.

Instead of speaking of the 'throwing of the die' and its 'turning up ace' as two events, the former is called 'the event' and the latter 'the way of its happening.' And these expressions may easily be extended to cover relations of distinct events; as when two men shoot at a mark and we desire to represent the probability of both hitting the bull's eye together, each shot may count as an event (denominator) and the coincidence of 'bull's-eyes' as the way of its happening (numerator).

It is also common to speak of probability as a proportion. If the fraction expressing the probability of ace being cast is 1/6, the proportion of cases in which it happens is 1 to 5; or (as it is, perhaps, still more commonly put) 'the chances are 5 to 1 against it.'
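The conversion between the fractional and the proportional ('odds') expression can be sketched thus (a hypothetical helper, not part of the text; exact fractions avoid rounding):

```python
from fractions import Fraction

def odds_against(p):
    """'The chances are 5 to 1 against it': convert a probability
    p into the ratio (1 - p) : p in lowest terms."""
    ratio = (1 - Fraction(p)) / Fraction(p)
    return ratio.numerator, ratio.denominator

odds_against(Fraction(1, 6))  # (5, 1)
```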

-- 3. As to the grounds of probability opinions differ. According to one view the ground is subjective: probability depends, it is said, upon the quantity of our Belief in the happening of a certain event, or in its happening in a particular way. According to the other view the ground is objective, and, in fact, is nothing else than experience, which is most trustworthy when carefully expressed in statistics.

To the subjective view it may be objected, (a) that belief cannot by itself be satisfactorily measured. No one will maintain that belief, merely as a state of mind, always has a definite numerical value of which one is conscious, as 1/100 or 1/10. Let anybody mix a number of letters in a bag, knowing nothing of them except that one of them is X, and then draw them one by one, endeavouring each time to estimate the value of his belief that the next will be X; can he say that his belief in the drawing of X next time regularly increases as the number of letters left decreases?

If not, we see that (b) belief does not uniformly correspond with the state of the facts. If in such a trial as proposed above, we really wish to draw X, as when looking for something in a number of boxes, how common it is, after a few failures, to feel quite hopeless and to say: "Oh, of course it will be in the last." For belief is subject to hope and fear, temperament, passion, and prejudice, and not merely to rational considerations. And it is useless to appeal to 'the Wise Man,' the purely rational judge of probability, unless he is producible. Or, if it be said that belief is a short cut to the evaluation of experience, because it is the resultant of all past experience, we may reply that this is not true. For one striking experience, or two or three recent ones, will immensely outweigh a great number of faint or remote experiences. Moreover, the experience of two men may be practically equal, whilst their beliefs upon any question greatly differ. Any two Englishmen have about the same experience, personal and ancestral, of the weather; yet their beliefs in the saw that 'if it rain on St. Swithin's Day it will rain for forty days after,' may differ as confident expectation and sheer scepticism. Upon which of these beliefs shall we ground the probability of forty days' rain?

But (c) at any rate, if Probability is to be connected with Inductive Logic, it must rest upon the same ground, namely--observation.

Induction, in any particular case, is not content with beliefs or opinions, but aims at testing, verifying or correcting them by appealing to the facts; and Probability has the same object and the same basis.

In some cases, indeed, the conditions of an event are supposed to be mathematically predetermined, as in tossing a penny, throwing dice, dealing cards. In throwing a die, the ways of happening are six; in tossing a penny only two, head and tail: and we usually assume that the odds with a die are fairly 5 to 1 against ace, whilst with a penny 'the betting is even' on head or tail. Still, this assumption rests upon another, that the die is perfectly fair, or that the head and tail of a penny are exactly alike; and this is not true. With an ordinary die or penny, a very great number of trials would, no doubt, give an average approximating to 1/6 or 1/2; yet might always leave a certain excess one way or the other, which would also become more definite as the trials went on; thus showing that the die or penny did not satisfy the mathematical hypothesis. Buffon is said to have tossed a coin 4040 times, obtaining 1992 heads and 2048 tails; a pupil of De Morgan tossed 4092 times, obtaining 2048 heads and 2044 tails.
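The coin-tossing figures quoted above can be checked directly (a small illustrative sketch; the variable names are our own):

```python
def observed_head_rate(heads, tails):
    """Observed proportion of heads; for a perfectly fair coin this
    should approach 1/2, but, as the text notes, a real coin may
    always retain a slight excess one way or the other."""
    return heads / (heads + tails)

buffon = observed_head_rate(1992, 2048)           # about 0.493
de_morgan_pupil = observed_head_rate(2048, 2044)  # about 0.5005
```

Both runs approximate 1/2 without reaching it exactly, which is just the behaviour the paragraph describes.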

There are other important cases in which probability is estimated and numerically expressed, although statistical evidence directly bearing upon the point in question cannot be obtained; as in betting upon a race; or in the prices of stocks and shares, which are supposed to represent the probability of their paying, or continuing to pay, a certain rate of interest. But the judgment of experts in such matters is certainly based upon experience; and great pains are taken to make the evidence as definite as possible by comparing records of speed, or by financial estimates; though something must still be allowed for reports of the condition of horses, or of the prospects of war, harvests, etc.

However, where statistical evidence is obtainable, no one dreams of estimating probability by the quantity of his belief. Insurance offices, dealing with fire, shipwreck, death, accident, etc., prepare elaborate statistics of these events, and regulate their rates accordingly. Apart from statistics, at what rate ought the lives of men aged 40 to be insured, in order to leave a profit of 5 per cent. upon 1000 payable at each man's death? Is 'quantity of belief' a sufficient basis for doing this sum?

-- 4. The ground of probability is experience, then, and, whenever possible, statistics; which are a kind of induction. It has indeed been urged that induction is itself based upon probability; that the subtlety, complexity and secrecy of nature are such, that we are never quite sure that we fully know even what we have observed; and that, as for laws, the conditions of the universe at large may at any moment be completely changed; so that all imperfect inductions, including the law of causation itself, are only probable. But, clearly, this doctrine turns upon another ambiguity in the word 'probable.' It may be used in the sense of 'less than absolutely certain'; and such doubtless is the condition of all human knowledge, in comparison with the comprehensive intuition of arch-angels: or it may mean 'less than certain according to _our_ standard of certainty,' that is, in comparison with the law of causation and its derivatives.

We may suppose some one to object that "by this relative standard even empirical laws cannot be called 'only probable' as long as we 'know no exception to them'; for that is all that can be said for the boasted law of causation; and that, accordingly, we can frame no fraction to represent their probability. That 'all swans are white' was at one time, from this point of view, not probable but certain; though we now know it to be false. It would have been an indecorum to call it only probable as long as no other-coloured swan had been discovered; not merely because the quantity of belief amounted to certainty, but because the number of events (seeing a swan) and the number of their happenings in a certain way (being white) were equal, and therefore the evidence amounted to 1 or certainty." But, in fact, such an empirical law is only probable; and the estimate of its probability must be based on the number of times that similar laws have been found liable to exceptions. Albinism is of frequent occurrence; and it is common to find closely allied varieties of animals differing in colour. Had the evidence been duly weighed, it could never have seemed more than probable that 'all swans are white.'

But what law, approaching the comprehensiveness of the law of causation, presents any exceptions?

Supposing evidence to be ultimately nothing but accumulated experience, the amount of it in favour of causation is incomparably greater than the most that has ever been advanced to show the probability of any other kind of event; and every relation of events which is shown to have the marks of causation obtains the support of that incomparably greater body of evidence. Hence the only way in which causation can be called probable, for us, is by considering it as the upward limit (1) to which the series of probabilities tends; as impossibility is the downward limit (0). Induction, 'humanly speaking,' does not rest on probability; but the probability of concrete events (not of mere mathematical abstractions like the falling of absolutely true dice) rests on induction and, therefore, on causation. The inductive evidence underlying an estimate of probability may be of three kinds: (a) direct statistics of the events in question; as when we find that, at the age of 20, the average expectation of life is 39.40 years. This is an empirical law, and, if we do not know the causes of any event, we must be content with an empirical law. But (b) if we do know the causes of an event, and the causes which may prevent its happening, and can estimate the comparative frequency of their occurring, we may deduce the probability that the effect (that is, the event in question) will occur.

Or (c) we may combine these two methods, verifying each by means of the other. Now either the method (b) or (_a fortiori_) the method (c) (both depending on causation) is more trustworthy than the method (a) by itself.

But, further, a merely empirical statistical law will only be true as long as the causes influencing the event remain the same. A die may be found to turn ace once in six throws, on the average, in close accordance with mathematical theory; but if we load it on that facet the results will be very different. So it is with the expectation of life, or fire, or shipwreck. The increased virulence of some epidemic such as influenza, an outbreak of anarchic incendiarism, a moral epidemic of over-loading ships, may deceive the hopes of insurance offices. Hence we see, again, that probability depends upon causation, not causation upon probability.

That uncertainty of an event which arises not from ignorance of the law of its cause, but from our not knowing whether the cause itself does or does not occur at any particular time, is Contingency.

-- 5. The nature of an average supposes deviations from it. Deviations from an average, or "errors," are assumed to conform to the law (1) that the greater errors are less frequent than the smaller, so that most events approximate to the average; and (2) that errors have no "bias," but are equally frequent and equally great in both directions from the mean, so that they are scattered symmetrically. Hence their distribution may be expressed by some such figure as the following:

[Illustration: FIG. 11.]

Here _o_ is the average event, and _oy_ represents the number of average events. Along _ox_, in either direction, deviations are measured. At _p_ the amount of error or deviation is _op_; and the number of such deviations is represented by the line or ordinate _pa_. At _s_ the deviation is _os_; and the number of such deviations is expressed by _sb_. As the deviations grow greater, the number of them grows less. On the other side of _o_, toward _-x_, at distances, _op'_, _os'_ (equal to _op_, _os_) the lines _p'a'_, _s'b'_ represent the numbers of those errors (equal to _pa_, _sb_).

If _o_ is the average height of the adult men of a nation, (say) 5 ft. 6 in., _s'_ and _s_ may stand for 5 ft. and 6 ft.; men of 4 ft. 6 in. lie further toward _-x_, and men of 6 ft. 6 in. further toward _x_. There are limits to the stature of human beings (or to any kind of animal or plant) in both directions, because of the physical conditions of generation and birth. With such events the curve _b'yb_ meets the abscissa at some point in each direction; though where this occurs can only be known by continually measuring dwarfs and giants. But in throwing dice or tossing coins, whilst the average occurrence of ace is once in six throws, and the average occurrence of 'tail' is once in two tosses, there is no necessary limit to the sequences of ace or of 'tail' that may occur in an infinite number of trials. To provide for such cases the curve is drawn as if it never touched the abscissa.

That some such figure as that given above describes a frequent characteristic of an average with the deviations from it, may be shown in two ways: (1) By arranging the statistical results of any homogeneous class of measurements; when it is often found that they do, in fact, approximately conform to the figure; that very many events are near the average; that errors are symmetrically distributed on either side, and that the greater errors are the rarer. (2) By mathematical demonstration based upon the supposition that each of the events in question is influenced, more or less, by a number of unknown conditions common to them all, and that these conditions are independent of one another. For then, in rare cases, all the conditions will operate favourably in one way, and the men will be tall; or in the opposite way, and the men will be short; in more numerous cases, many of the conditions will operate in one direction, and will be partially cancelled by a few opposing them; whilst in still more cases opposed conditions will approximately balance one another and produce the average event or something near it. The results will then conform to the above figure.
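The second, mathematical argument can be imitated by generating events each of which is the sum of many independent two-valued conditions (a hypothetical simulation; the names, counts, and seed are our own):

```python
import random
from collections import Counter

def composite_events(n_events, n_conditions=10, seed=2):
    """Each event is the sum of ten independent conditions, each of
    which either operates (1) or does not (0); the totals cluster
    symmetrically about the mean, as the argument above describes."""
    rng = random.Random(seed)
    return [sum(rng.choice((0, 1)) for _ in range(n_conditions))
            for _ in range(n_events)]

counts = Counter(composite_events(10000))
# The middle value (5) is the most frequent; the extremes (0 and 10),
# where all conditions operate one way, are the rarest.
```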

From the above assumption it follows that the symmetrical curve describes only a 'homogeneous class' of measurements; that is, a class no portion of which is much influenced by conditions peculiar to itself.

If the class is not homogeneous, because some portion of it is subject to _peculiar_ conditions, the curve will show a hump on one side or the other. Suppose we are tabulating the ages at which Englishmen die who have reached the age of 20, we may find that the greatest number die at 39 (19 years being the average expectation of life at 20) and that as far as that age the curve upwards is regular, and that beyond the age of 39 it begins to descend regularly, but that on approaching 45 it bulges out some way before resuming its regular descent--thus:

[Illustration: FIG. 12.]

Such a hump in the curve might be due to the presence of a considerable body of teetotalers, whose longevity was increased by the peculiar condition of abstaining from alcohol, and whose average age was 45, six years more than the average for common men.

Again, if the group we are measuring be subject to selection (such as British soldiers, for which profession all volunteers below a certain height--say, 5 ft. 5 in.--are rejected), the curve will fall steeply on one side, thus:

[Illustration: FIG. 13.]

If, above a certain height, volunteers are also rejected, the curve will fall abruptly on both sides. The average is supposed to be 5 ft. 8 in.

The distribution of events is described by 'some such curve' as that given in Fig. 11; but different groups of events may present figures or surfaces in which the slopes of the curves are very different, namely, more or less steep; and if the curve is very steep, the figure runs into a peak; whereas, if the curve is gradual, the figure is comparatively flat. In the latter case, where the figure is flat, fewer events will closely cluster about the average, and the deviations will be greater.

Suppose that we know nothing of a given event except that it belongs to a certain class or series, what can we venture to infer of it from our knowledge of the series? Let the event be the cephalic index of an Englishman. The cephalic index is the breadth of a skull multiplied by 100 and divided by the length of it; e.g. if a skull is 8 in. long and 6 in. broad, (6 × 100)/8 = 75. We know that the average English skull has an index of 78. The skull of the given individual, therefore, is more likely to have that index than any other. Still, many skulls deviate from the average, and we should like to know what is the probable error in this case. The probable error is the measurement that divides the deviations from the average in either direction into halves, so that there are as many events having a greater deviation as there are events having a less deviation. If, in Fig. 11 above, we have arranged the measurements of the cephalic index of English adult males, and if at _o_ (the average or mean) the index is 78, and if the line _pa_ divides the right side of the fig. into halves, then _op_ is the probable error. If the measurement at _p_ is 80, the probable error is 2. Similarly, on the left hand, the probable error is _op'_, and the measurement at _p'_ is 76. We may infer, then, that the skull of the man before us is more likely to have an index of 78 than any other; if any other, it is equally likely to lie between 80 and 76, or to lie outside them; but as the numbers rise above 80 to the right, or fall below 76 to the left, it rapidly becomes less and less likely that they describe this skull.
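The probable error, as defined above, can be approximated from raw measurements by taking half the spread between the quartiles (a rough sketch, not the text's graphical construction; the crude index-based quartiles are adequate only for largish samples):

```python
def probable_error(values):
    """Half the deviations lie within the probable error of the
    average: estimated here as half the distance between the first
    and third quartiles of the sorted measurements."""
    xs = sorted(values)
    n = len(xs)
    q1 = xs[n // 4]          # crude lower quartile
    q3 = xs[(3 * n) // 4]    # crude upper quartile
    return (q3 - q1) / 2
```

For the measurements 1 to 100, for instance, half the values lie between 26 and 76, giving a probable error of 25.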

In such cases as heights of men or skull measurements, where great numbers of specimens exist, the average will be actually presented by many of them; but if we take a small group, such as the measurements of a college class, it may happen that the average height (say, 5 ft. 8 in.) is not the actual height of any one man. Even then there will generally be a closer cluster of the actual heights about that number than about any other. Still, with very few cases before us, it may be better to take the median than the average. The median is that event on either side of which there are equal numbers of deviations. One advantage of this procedure is that it may save time and trouble. To find approximately the average height of a class, arrange the men in order of height, take the middle one and measure him. A further advantage of this method is that it excludes the influence of extraordinary deviations. Suppose we have seven cephalic indices, from skeletons found in the same barrow, 75-1/2, 76, 78, 78, 79, 80-1/2, 86.

The average is 79; but this number is swollen unduly by the last measurement; and the median, 78, is more fairly representative of the series; that is to say, with a greater number of skulls the average would probably have been nearer 78.
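The barrow example can be verified with the standard mean and median (a brief modern sketch using a stock statistics library):

```python
from statistics import mean, median

# The seven cephalic indices from the barrow, as given above.
indices = [75.5, 76, 78, 78, 79, 80.5, 86]

print(mean(indices))    # 79.0, swollen by the outlying 86
print(median(indices))  # 78, more fairly representative of the series
```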

To make a single measurement of a phenomenon does not give one much confidence. Another measurement is made; and then, if there is no opportunity for more, one takes the mean or average of the two. But why? For the result may certainly be worse than the first measurement.

Suppose that the events I am measuring are in fact fairly described by Fig. 11, although (at the outset) I know nothing about them; and that my first measurement gives _p_, and my second _s_; the average of them is worse than _p_. Still, being yet ignorant of the distribution of these events, I do rightly in taking the average. For, as it happens, 3/4 of the events lie to the left of _p_; so that if the first trial gives _p_, then the average of _p_ and any subsequent trial that fell nearer than (say) _s'_ on the opposite side, would be better than _p_; and since deviations greater than _s'_ are rare, the chances are nearly 3 to 1 that the taking of an average will improve the observation. Only if the first trial give _o_, or fall within a little more than 1/2 _p_ on either side of _o_, will the chances be against any improvement by trying again and taking an average. Since, therefore, we cannot know the position of our first trial in relation to _o_, it is always prudent to try again and take the average; and the more trials we can make and average, the better is the result. The average of a number of observations is called a "Reduced Observation."
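The 'nearly 3 to 1' claim can be imitated under the assumption of a normal law of error (the constant 0.6745, the normal probable error, and the simulation itself are our assumptions, not the text's):

```python
import random

PROBABLE_ERROR = 0.6745  # for a normal law, half the errors fall within this

def improvement_rate(trials=40000, seed=3):
    """Suppose the first measurement errs by exactly the probable
    error; count how often averaging in a second, normally
    distributed measurement brings us nearer the true value (0)."""
    rng = random.Random(seed)
    first = PROBABLE_ERROR
    better = sum(1 for _ in range(trials)
                 if abs((first + rng.gauss(0, 1)) / 2) < first)
    return better / trials
```

The rate comes out close to 3/4, in agreement with the text's "nearly 3 to 1".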

We may have reason to believe that some of our measurements are better than others because they have been taken by a better trained observer, or by the same observer in a more deliberate way, or with better instruments, and so forth. If so, such observations should be 'weighted,' or given more importance in our calculations; and a simple way of doing this is to count them twice or oftener in taking the average.
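Counting an observation twice in the average is the same as giving it an integer weight, as a short sketch shows (hypothetical names and figures):

```python
def weighted_mean(observations, weights):
    """Weighting an observation by counting it twice (or oftener)
    is the same as multiplying it by an integer weight before
    dividing by the total count."""
    total = sum(x * w for x, w in zip(observations, weights))
    return total / sum(weights)

# Counting the better observation (10.0) twice:
weighted_mean([10.0, 10.6], [2, 1])  # same as the mean of [10.0, 10.0, 10.6]
```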

-- 6. These considerations have an important bearing upon the interpretation of probabilities. The average probability for any _general class_ or series of events cannot be confidently applied to any _one instance_ or to any _special class_ of instances, since this one, or this special class, may exhibit a striking error or deviation; it may, in fact, be subject to special causes. Within the class whose average is first taken, and which is described by general characters as 'a man,' or 'a die,' or 'a rifle shot,' there may be classes marked by special characters and determined by special influences. Statistics giving the average for 'mankind' may not be true of 'civilised men,' or of any still smaller class such as 'Frenchmen.' Hence life-insurance offices rely not merely on statistics of life and death in general, but collect special evidence in respect of different ages and sexes, and make further allowance for teetotalism, inherited disease, etc.

Similarly with individual cases: the average expectation for a class, whether general or special, is only applicable to any particular case if that case is adequately described by the class characters. In England, for example, the average expectation of life for males at 20 years of age is 39.40; but at 60 it is still 13.14, and at 73 it is 7.07; at 100 it is 1.61. Of men 20 years old those who live more or less than 39.40 years are deviations or errors; but there are a great many of them. To insure the life of a single man at 20, in the expectation of his dying at 60, would be a mere bet, if we had no special knowledge of him; the safety of an insurance office lies in having so many clients that opposite deviations cancel one another: the more clients the safer the business. It is quite possible that a hundred men aged 20 should be insured in one week and all of them die before 25; this would be ruinous, if others did not live to be 80 or 90.

Not only in such a practical affair as insurance, but in matters purely scientific, the minute and subtle peculiarities of individuals have important consequences. Each man has a certain cast of mind, character, physique, giving a distinctive turn to all his actions even when he tries to be normal. In every employment this determines his Personal Equation, or average deviation from the normal. The term Personal Equation is used chiefly in connection with scientific observation, as in Astronomy. Each observer is liable to be a little wrong, and this error has to be allowed for and his observations corrected accordingly.

The use of the term 'expectation,' and of examples drawn from insurance and gambling, may convey the notion that probability relates entirely to future events; but if based on laws and causes, it can have no reference to point of time. As long as conditions are the same, events will be the same, whether we consider uniformities or averages. We may therefore draw probable inferences concerning the past as well as the future, subject to the same hypothesis, that the causes affecting the events in question were the same and similarly combined. On the other hand, if we know that conditions bearing on the subject of investigation have changed since statistics were collected, or were different at some time previous to the collection of evidence, every probable inference based on those statistics must be corrected by allowing for the altered conditions, whether we desire to reason forwards or backwards in time.

-- 7. The rules for the combination of probabilities are as follows:

(1) If two events or causes do not concur, the probability of one or the other occurring is the sum of the separate probabilities. A die cannot turn up both ace and six; but the probability in favour of each is 1/6: therefore, the probability in favour of one or the other is 1/3. Death can hardly occur from both burning and drowning: if 1 in 1000 is burned and 2 in 1000 are drowned, the probability of being burned or drowned is 3/1000.
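Rule (1) in a minimal sketch (a hypothetical function; exact fractions avoid rounding):

```python
from fractions import Fraction

def either_probability(p, q):
    """Probability of one or the other of two events that cannot
    concur: the sum of the separate probabilities."""
    return p + q

either_probability(Fraction(1, 6), Fraction(1, 6))        # 1/3: ace or six
either_probability(Fraction(1, 1000), Fraction(2, 1000))  # 3/1000: burned or drowned
```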

(2) If two events are independent, having neither connection nor repugnance, the probability of their concurring is found by multiplying together the separate probabilities of each occurring. If in walking down a certain street I meet A once in four times, and B once in three times, I ought (by mere chance) to meet both once in twelve times: for in twelve occasions I meet B four times; but once in four I meet A.
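Rule (2) likewise (a hypothetical helper):

```python
from fractions import Fraction

def both_probability(p, q):
    """Probability that two independent events concur: the product
    of their separate probabilities."""
    return p * q

both_probability(Fraction(1, 4), Fraction(1, 3))  # 1/12: meeting both A and B
```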

This is a very important rule in scientific investigation, since it enables us to detect the presence of causation. For if the coincidence of two events is more or less frequent than it would be if they were entirely independent, there is either connection or repugnance between them. If, e.g., in walking down the street I meet both A and B oftener than once in twelve times, they may be engaged in similar business, calling them from their offices at about the same hour. If I meet them both less often than once in twelve times, they may belong to the same office, where one acts as a substitute for the other. Similarly, if in a multitude of throws a die turns six oftener than once in six times, it is not a fair one: that is, there is a cause favouring the turning of six. If of 20,000 people 500 see apparitions and 100 have friends murdered, the chance of any man having both experiences is 1/8000; but if each lives on the average 300,000 hours, the chance of both events occurring in the same hour is 1/2400000000. If the two events occur in the same hour oftener than this, there is more than a chance coincidence.
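The apparition figures can be re-derived step by step (a sketch with our own variable names):

```python
from fractions import Fraction

apparitions = Fraction(500, 20000)       # 1/40 see apparitions
murdered_friend = Fraction(100, 20000)   # 1/200 have friends murdered

# If the experiences are independent, the chance of having both:
both = apparitions * murdered_friend     # 1/8000

# Chance of the two events falling in the same one of 300,000 hours:
same_hour = both * Fraction(1, 300000)   # 1/2400000000
```

A frequency of coincidence greater than this would, as the text says, indicate more than chance.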

The more minute a cause of connection or repugnance between events, the longer the series of trials or instances necessary to bring out its influence: the less a die is loaded, the more casts must be made before it can be shown that a certain side tends to recur oftener than once in six.

(3) The rule for calculating the probability of a dependent event is the same as the above; for the concurrence of two independent events is itself dependent upon each of them occurring. My meeting with both A and B in the street is dependent on my walking there and on my meeting one of them. Similarly, if A is sometimes a cause of B (though liable to be frustrated), and B sometimes of C (C and B having no causes independent of B and A respectively), the occurrence of C is dependent on that of B, and that again on the occurrence of A. Hence we may state the rule: If two events are dependent each on another, so that if one occur the second may (or may not), and if the second a third; whilst the third never occurs without the second, nor the second without the first; the probability that if the first occur the third will, is found by multiplying together the fractions expressing the probability that the first is a mark of the second and the second of the third.

Upon this principle the value of hearsay evidence or tradition deteriorates, and generally the cogency of any argument based upon the combination of approximate generalisations dependent on one another or "self-infirmative." If there are two witnesses, A and B, of whom A saw an event, whilst B only heard A relate it (and is therefore dependent on A), what credit is due to B's recital? Suppose the probability of each man's being correct as to what he says he saw, or heard, is 3/4: then (3/4 × 3/4 = 9/16) the probability that B's story is true is a little more than 1/2. For if in 16 attestations A is wrong 4 times, B can only be right in 3/4 of the remainder, or 9 times in 16. Again, if we have the Approximate Generalisations, 'Most attempts to reduce wages are met by strikes,' and 'Most strikes are successful,' and learn, on statistical inquiry, that in every hundred attempts to reduce wages there are 80 strikes, and that 70 p.c. of the strikes are successful, then 56 p.c. of attempts to reduce wages are unsuccessful.
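The hearsay and strike calculations above can be expressed as one chained product (the function name is our own):

```python
from fractions import Fraction

def chained_probability(*links):
    """Probability of a conclusion resting on a chain of dependent
    attestations or generalisations: multiply the links together."""
    result = Fraction(1)
    for p in links:
        result *= Fraction(p)
    return result

hearsay = chained_probability(Fraction(3, 4), Fraction(3, 4))        # 9/16
strikes = chained_probability(Fraction(80, 100), Fraction(70, 100))  # 14/25, i.e. 56 p.c.
```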
