Is it really surprising that almost all behavior, and almost all human behavior in particular, can be considered in a scientific light?
I’ve heard that before, but I’ve always wondered exactly what people mean by it. There are many examples of people who once believed but now don’t, some even posting here regularly. Are you saying that they lacked “true faith”?
erl
I do not find it at all surprising that things which are not science fall within the methodological boundaries which Mr S has proposed. That has been my position all along.
Mr S
Upon consideration, I regret harping upon your inclusion of fetishistic in your list of my “offenses”. That I had withdrawn the comment and apologized for it long before is entirely inconsequential. Please pay no mind to my remarks on that subject.
Spiritus:
[QUOTE]
[QUOTE]
[ul][li]The question of yours which I found fraudulent and offensive was posted before all of the remarks you list save for one.[/li]
[li]That one has been apologized for, and you have accepted the apology.[/li]
[li]Including that claim in your list of grievances is yet another false claim on your part.[/li][/ul][/QUOTE]
I stand corrected. Please allow me to rephrase:
*That’s rich. You insinuate that I am a monumental, foolish, obsessive, fetishistic idiot, and then conclude your post by complaining that you feel insulted by me. *
Is that better?
I see. You now insinuate, on top of the above, that I am also graceless and cowardly. The list of my bad qualities, apparently, just goes on and on.
Well, if I am to stick with my definition, my original answer would have read, “Yes – but only in combination with a measuring instrument.” I know that you seem to be having a difficult time grasping this point, but fer God’s sake, it ain’t rocket science.
I suspect that this misunderstanding, that seems to have puzzled others here as well, is based at least in part on the way in which I originally worded my definition. The post was poorly worded on several counts, as I’ve come to understand belatedly, and I unconditionally apologize to anyone who found it confusing.
However, there have been several good examples posted in this thread that seem to have proved me wrong. I am no longer convinced of the validity of my definition; but erl’s, which is considerably more precise, might be worth critical inspection. Anyway, I’m on the verge of relinquishing this definition, in which case I would even recant on the conditional stated above.
Me neither. Nor am I unclear about dancing around board rules, innuendo, or any of the other myriad tactics you seem to use when things don’t go your way. To call another poster’s comments idiotic is insulting, degrading, and disrespectful. Although we have been in disagreement, at least I haven’t stooped that low, yet (although I’m getting close). Nor do I intend to.
The rules here allow one to say almost anything one wants about a post, as long as one does not say it directly about a poster. This permits one a great deal of latitude; but beyond that is a concept known as common courtesy, which is a trait sadly lacking in your last few responses. So specifically, I’m not accusing you of breaking any rules; I’m simply letting you know that as this discussion has gone on, I’ve found your style of posting to be more and more offensive.
No. Rather, I am defending myself from your apparently baseless claim that I engage in rhetorical arguments and reductio ad absurdum, and attempting to communicate to you that I would prefer a straightforward, open, honest debate without rancor, ill-will, or name-calling. I don’t suppose that would be too much to ask for in this forum, would it?
Frankly, yes. I feel that you have been condescending, if not downright patronizing, towards me.
Not worth the bandwidth. At this point, I’m not even sure why I’m bothering to respond to you at all.
Just don’t like it when folks try to bully me around, I guess.
Oh, I like this one. If I’ve misrepresented your posts I apologize, unconditionally; it is not my intention, as I’ve stated on numerous occasions, although it strikes me as strange that you are the only person in the thread who feels that I suffer from this particular tendency. And don’t worry; you haven’t hurt my feelings even slightly. I’ve just found our conversation to be more and more pointless, and irritating, as time has gone on.
Very well:
- In a response to starryspice, I wrote: * However, let me put it to you this way: how do you differentiate your guesswork concerning “the nature of the system” in one of your laboratory experiments, from your guesswork concerning, say, your next door neighbor’s love life? After all, both are theories that you may have about phenomena that have presented themselves to you as an experiencing subject.
I submit that the major difference between these two kinds of guesswork, which allows us to classify the first as science and the second as speculative gossip, is that the former builds primarily up from a basis in instrumental measurement.*
This quote can be found in my first reply to the thread after the OP.
- In response to Lib: * I would agree that the ultimate instrument of inquiry is the human brain, but I still feel at the intuitive level that one can nevertheless categorize this brain’s modes of inquiry into scientific and non-scientific. Since we regularly categorize knowledge products into one of the preceding two boxes, it is not unreasonable to discuss their meaning. I’m trying to argue that when the human brain investigates its surroundings by means of instrumental measurement, we can call that science.* Unless, of course, you wish to disassociate the human brain and sensory impressions, I’d say that was pretty clear. In continuation, I point out: * Ultimately, all knowledge is mediated by the subject, usually (although not always) in the form of a narrative. But at the risk of sounding repetitive, the question remains: is that knowledge acquired from direct interaction with Nature, or through the filter of a measuring device? * Obviously, I consider science to be a subcategory of knowledge.
3 & 4) Twice I wrote, directly to you: * I have nothing in particular against what I perceive to be your strictly phenomenological approach to this question, nor would I deny that all knowledge is mediated, ultimately, by the subject. * Again, I consider science a subdivision of knowledge. Thus, by this passage I meant “science, as subdivision of knowledge, is ultimately mediated by the subject (i.e., by sensory impressions).” I thought that was pretty clear, but on review I see how it could be misconstrued.
- Further down: * To reiterate: ultimately, all knowledge is mediated by the subject. Someone is looking at all those spots on the graph. What differentiates scientific inquiry from other forms of inquiry, I claim, is that while the latter derives info by other means, science derives it by subjectively inspecting the results presented by a measuring instrument.* This seems even clearer.
- Realizing that we were getting off track regarding this question, I presented you with a schematic. It looked like this:
1) Subject ------> Instrument ------> Object; or

2) Subject ------> Object
           \-----> Instrument
You may note, specifically, example 2, in which the subject is directly observing the object, against “the background,” as it were, of some kind of measuring instrument. Admittedly, my central focus has been 1), but as far as I am aware, I have not dismissed 2) as non-scientific.
- You write:
I respond, “Undoubtedly true.” I then go on to argue that even though this is the case, what makes the observations scientific is that they were done in collaboration with a measuring instrument. Whatever else might be the case, I did not in my argument deny that the sensory impressions of the scientists in question were a valid element of their scientific research. I then try to explain, vainly, that I have not “removed human sensory impressions from the equation.” Rather than pausing to ask me what I meant by this, since we seem to be at cross-purposes, you simply rephrased your claim: “* You have, repeatedly, said that you do not consider direct human observations to be within the realm of science*.” What I have tried to say, repeatedly, is that I do not consider direct human observations alone, without reference to an instrument of measurement, to be within the realm of science.
- Finally, the example of Og the caveman is predicated on a combination of measurement and sensory impressions.
I hope those are enough examples for you.
This is it from me, by the way. I have grown weary of this childish bickering, and will not continue to engage in it with you any further.
But what I mean, Spiritus, is that almost all behavior should be able to be considered in a scientific manner. Any definition for science which fails to account for that fails. Then the question becomes, is there a dividing line for good and bad science contained within the definition, or is that external? Of course I think Mr S’s definition requires external ideas to combat astrology and promote particle physics simultaneously. But is that really a flaw in it? Bad science, IMO, is still science, just like “immorality” falls under the scope of a moral system.
Mr S
Your persistence in imagining that I am speaking of phenomenology when addressing human senses as valid means for gathering scientific data borders upon the comical. I have said:[ul]
[li]Human sensory impressions have been historically, and remain, a valid element of scientific inquiry.[/li][li]my position is not, as you seem to think, that you are wrong because of the inescapable phenomenological truths of human existence. You are wrong because direct human perceptions have been and remain a valid method for gathering scientific data.[/li][li]However, if you wish me to repeat some sciences in which direct human observations have been and remain valid methods for obtaining data, then I will offer: taxonomy, biology, chemistry, astronomy, and physics.[/li][li]As to my claim: human senses have been and remain valid means for gathering scientific data. That is a simple statement, despite your feelings.[/li][li]since we have both made it clear that issues of phenomenological necessity are not within the context of our discussion. If human sensory impressions are not a means of gathering scientific information, then you have “removed them from the equation” when discussing a scientific endeavor.[/li][li]Let me ask you plainly–do you agree that human senses can be valid instruments for recording scientific data?[/li][/ul]
My position is not phenomenological, despite your repeated attempts to paint me with that brush. Human beings, looking directly at nature without intervening artificial instruments, are capable of undertaking scientific investigations. I have little hope that you will understand this statement more clearly than the ones above, yet I perversely find myself typing the words. Obsessive attachment is not solely the province of Mr S, it seems.
It is no more accurate, but if you feel that any criticism of your posts is necessarily an insinuation on your character then you are welcome to bear your crosses in whatever manner you choose.
I find conditional apologies to be graceless and cowardly. I also find it graceless to refuse an offered apology, no matter the form, if it is perceived as sincere. From this you might infer that I do not make it a policy to be always graceful. Or you could nurse a plethora of perceived personal insults, of course. It matters little to me.
I have no trouble at all grasping this point. I simply disagree with it. You, on the other hand, seem incapable of understanding that it is exactly this point with which I disagree, not the phenomenologically-based phantasm which haunts your posts.
I agree, it ain’t rocket science. But it does apparently fly higher than you can grasp.
Why would you imagine that I feel things are not going my way? You have offered no response to such counterexamples to your definition as taxonomy and ethology, nor have you addressed the insufficiency inherent in binding only one variable when your field is defined over three, nor have you addressed the failure of your definition to exclude such things as electronic music production. Perhaps you feel that failing to secure your argument against such points grants you the upper hand, but I assure you that I feel no desperation at my rhetorical plight.
As to my tactics–they are to honestly characterize what I see in your posts. If that includes a moment of idiocy, then I call that moment idiotic. That you cannot perceive such a thing except as an innuendo about your intelligence is perhaps something which your profession might illuminate for you; I cannot.
I agree. I also found it lacking in posts that charged me with rhetorical tricks while accusing me of repeatedly changing my position or treated my apology as an insult.
Regardless, the irony of your chiding me for lacking common courtesy in my recent posts is simply lovely. Thanks for the smile.
Ah, yes, *it’s a wonder you haven’t been banned* conveys that thought so very precisely. Still, our feelings are asymmetrical in this. I have no issues with your style of posting; it is the content of your posts to which I have objected.
The claim is not baseless. You asked me to provide five examples of scientific disciplines in which instrumentation was never used as a means of gathering data. Yet you claim this was not a reductio ad absurdum.
Your method of communication is interesting, as is your command of logic and rhetoric.
I have neither patronized nor condescended to you. Insincere agreement is patronizing. Withholding plain criticism as if a person is too fragile to have his ideas challenged is condescending. I consider an equal quite capable of hearing that he is wrong without feeling personally insulted. I have treated you as such, and will continue to do so regardless of the evidence in this thread.
Bully? Noting that you posted something idiotic is bullying. Got it. Oh, wait, I think that’s idiotic, too.
Damn, I guess I am a mean old message board bully.
Ah–the unconditional if statement–joy of first-order logic students everywhere.
As near as I can tell, I am the only person in this thread whose ideas you have consistently misrepresented. Of course, this exchange was introduced by a simple request on my part: please show when I introduced the subject of litmus paper as "irrefutable evidence for [your] idiocy."
It strikes me as not at all strange for you to fail to provide such evidence while continuing to pretend that only I could perceive that as a misrepresentation. No doubt you see this as another of those rhetorical tactics which I have baselessly claimed that you apply.
Yes, and you take the high road on your way out, too.
For the record, I also consider the tactic of “insult and run” to be graceless and cowardly.
erl
Well, then a definition which emphasizes methodology alone as the demarcating quality of science fails. Again, this has been my point all along.
Yes, if one considers particle physics to be a science and astrology (and music production) to not be sciences.
Depending on what definition we come up with, you might be!
I’m not sure there is a difference between these two statements. One way or another, doubt must be raised before contrary evidence convinces a person to alter his/her view, regardless of whether that doubt is a healthy, consistent skepticism or induced by an anomaly. A person who is not open to the possibility, however infinitesimal, of being wrong will never be convinced.
This brings me to the issues of doubt, uncertainty, and so on. My personal take – my philosophy of science – tells me that we can never really know anything with 100% certainty. Why? Because our human brains are finite, and it strikes me as arrogant, the worst kind of hubris, to assume that we humans must be right just because we can’t imagine how we could be wrong. So I approach my work with that kind of attitude. Maybe it comes from being an analytical chemist, being in the position of having to explain corroborating or conflicting evidence, doing my best to be certain that my chemist colleagues don’t go away with some completely off-base idea about what I’ve just told them about their results. And I have to do this quite a lot. If I had a dollar for every time I was told by a synthetic chemist, “But I know this has to be the right stuff,” I wouldn’t need my job anymore. I used to think that this happened largely because, to non-specialists, mass spectrometry is a sort of black-box analytical technique, and it was much more comfortable for them to believe that mass spec gave them an absolute answer, or to ignore it entirely when it didn’t produce the expected results, than it was to actually understand the processes involved. Not surprising, though. Nobody can learn everything; that’s what specialists are for. . .
But then while discussing this with all y’all, I realized: maybe I run into this kind of “I know this is right” behavior so much because this is, in practice, the prevailing attitude. Maybe I am part of a small minority who still thinks that, no matter how many experiments you do to corroborate a hypothesis, there is always the tiniest chance that someday you will find evidence which disproves it. Maybe to many/most of my colleagues, the accumulation of a certain amount of supporting evidence in the end constitutes absolute proof. As several of you have pointed out, this is necessary to make any kind of progress. But there is a huge difference in treating the conclusion one draws from the accumulated evidence as an assumption vs. fact. It is the difference between recognizing one’s own limitations and ignoring them.
For instance, Isaac Newton formulated ideas which described the motions of physical objects. I would suppose that these ideas grew out of keen observations which were later put to the test. He probably tested these ideas under many different circumstances. No doubt he found no circumstances in which the ideas he had postulated were proved wrong. He could imagine no situation in which they did not apply. In the end, he formally expressed these ideas as Laws of Motion. No matter how finely you care to split the falsifiability hair, for a long time these laws were taken, in practice, as incontrovertible fact because no one had been able to find, or imagine, a situation in which they did not seem to apply.
Along comes Einstein with his theory of relativity, which later acquired its own corroborating evidence. Now, it seems, Newton’s Laws of Motion are not carved in stone after all; there are cases in which they do not apply. But, because between Newton and Einstein no one had ever been able to conceive of how Newton’s laws might be wrong, it was assumed that they were right. And that assumption was itself wrong.
I am not Albert Einstein. Most of us are not. Why should I take something as an absolute, proven fact simply because so far no one has been able to imagine how it could be wrong?
eris wondered about how “deep down” doubt is carried into research endeavors. At what point do we stop questioning the theory, hypothesis, assumption, idea and just take it as given, so that we can move on? In practice I think this doubt is shallow, so that in practice perhaps only lip-service is paid to the ideals of the Baconian method and related philosophies after all.
As I said before, most of my work involves testing the hypotheses of my fellow synthetic chemists, and in this work I depend heavily on instrumentation constructed around the theories and ideas on which mass spectrometry is based. The instrumentation is checked periodically to determine whether it is in good working order, calibrated, etc. So if I obtain a result which is not consistent with the proposed molecule, my first thought is usually that the molecule is something else instead. However, it could be that the chemist obtains results from a different lab running different tests which are consistent with the proposed molecule – or better yet, the other results are consistent neither with her proposed molecule nor with my results. What now? I and the other analytical person put our heads together and try to figure out what happened. At this point the doubt descends one level. Maybe something unusual or unexpected happened during one experiment or the other, so that the molecule actually changed, or responded differently. Various possibilities are discussed, various new experiments carried out to avoid these possibilities or correct for them. It is usually on this level that apparent inconsistencies or anomalies are resolved.
What if they aren’t resolved? What is the next level of doubt to which we descend? More importantly, is there ever a point where the theoretical underpinnings of the discipline – mass spectrometry or whatever – are called into question? Perhaps, but not by me, in my line of work, for several reasons: (1) healthy self-doubt; when I encounter an anomaly in the data as described above, I am far more likely to consider, first, that I have made a mistake or overlooked some variable or did not check the instrument status or. . . you get the idea. In my mind, it is far more likely that I, individually, am wrong than that we chemists, collectively, are wrong. (2) I have neither the time, resources, mandate, nor, frankly, the inclination to conduct research into the fundamental underpinnings of my discipline. So that even if I were to suspect that some underlying principle was flawed, for practical and logistical reasons I would not be in a position to pursue the matter further.
So while my philosophy of doubt runs deep, my practice of doubt reaches its limit rather quickly. If a problem ends up absorbing too much time – mine or the synthetic chemist’s – the hypothesis is usually abandoned and we go on to something else entirely. In applied chemistry at a pharmaceutical company, the bottom line is getting drugs to market, not conducting fundamental research. I make the necessary assumptions, compromises and choices that allow me/us to continue towards that goal. I still maintain my philosophy of doubt, though, even if I cannot always pursue it; others choose the more practical philosophy of equating a reasonable assumption with fact. For some it’s easier.
Why? Well, consider the logical consequences of my philosophy of doubt. If we can never know anything with 100% certainty, then we can never know whether the ideas/theories which we hold actually represent reality, or are simply a very good model or metaphor. It raises the possibility that the Universe only behaves as if the model were reality. This is a very unsettling idea. And I think that a lot of chemists – a lot of people in general – take greater comfort in the idea that we know, or can know, the structure of the Universe in some absolute way.
So in a way “science” does come down to belief: belief either in certainty or uncertainty.
If this rambles too much and makes no sense, I will not take it unkindly if someone tells me to shut up.
Also, in rereading this debate, I realize that I placed mathematics outside the realm of science. This was a reaction to my impressions based on the principle of falsifiability. I still feel that this is pretty much the case; that is, that mathematical theories aren’t really even superficially similar to particle physics’ or psychology’s theories. That the world may never have pure five-dimensional objects is of no consequence, and that Klein bottles cannot exist except in pure topological constructs doesn’t matter. And, in fact, finding or not finding a Klein bottle would be of no consequence. Inconsistent mathematics—like set theories which allow paradoxical syntax like the Russell paradox—certainly didn’t topple mathematics at all. The calculus hasn’t changed, to my knowledge. This is because there is no standard by which we can judge mathematics, while at least with what I would call “normal” scientific inquiry we have the real world to deal with (or impressions of what we assume the real world is, to make a weaker claim).
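For anyone who hasn’t run into it, the Russell paradox mentioned above can be sketched in one line: under naive comprehension, define the set of all sets that are not members of themselves, then ask whether it is a member of itself.

```latex
% Naive comprehension permits the definition of
%   R, the set of all sets that are not members of themselves.
% Asking whether R contains itself yields a contradiction either way:
\[
  R = \{\, x \mid x \notin x \,\}
  \qquad\Longrightarrow\qquad
  R \in R \iff R \notin R
\]
```

Either answer contradicts the other, which is why axiomatic set theories (ZF, for instance) restrict comprehension to rule the definition out; the rest of mathematics, the calculus included, is left untouched, consistent with the point above.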
Typical division between analytic and synthetic statements. I would not be the first, and definitely won’t be the last.
I can see why some might consider mathematics a science (and I have made statements in this thread to that effect), and I suppose I am more of a fence-sitter here than landing strictly with one camp or another. Like all things from food consumption to solubility determination, it can be handled scientifically. But words based on experimentation and prediction have very similar meanings across all of science until we get to mathematics where, without stretching interpretations, we cannot use the same terms and, in fact, some terms must be removed entirely or have their interpretations altered beyond recognition (“proof” for one obvious example). Now, I can see a simple, well-motivated attempt to create a context in which mathematical terms are shown to be synonymous to traditional scientific terms, but I think that is a bit of puzzle-piece smashing and duct-taping after the fact, not something that was really ever inherent in the two fields together, and besides which still requires that we remove the principle of falsifiability (allowing us to capture the use of the word “proof”). A tendency to group all academia together has yet to really leave common perception.
So, I would prefer to keep mathematics out of the realm of science. I’m sure that I would not be able to do this without some resistance from parties present. For the intents of this thread I think mathematics lends itself well to the application of being an instrument itself, and serves well to bridge the gap between the senseless statements of mathematics and the application of these contextless statements to statements about the real world.
Now, during the construction of the above I find two replies. Figures!
Jerevan, sidenote… wouldn’t happen to be a happy employee of BMS, would you? Not that I wouldn’t understand if you failed to answer on a public message board, just seems that our fields quite possibly overlap, and, well, I was just in Wilmington a little bit ago at BMS. Thought it would be an interesting “small world” sort of thing.
But having absolute certainty means one is convinced that this is right now. It doesn’t mean that one is impervious to removing that certainty. I am absolutely certain my name is erislover on this board, yet I can imagine a scenario and series of events which would convince me otherwise. That doesn’t mean that I doubt that I post under the name erislover. See? This also serves to address your “I ain’t arrogant” idea. It isn’t that I’m pompous enough to think that I must post under the name erislover, but that there is no room in my life for doubt of such a thing. Back to the Wittgenstein thing, here’s a perfect quote that I seem to be able to work into a lot of conversations (possibly too many, now that I think about it): *I can’t be making a mistake;—but some day, rightly or wrongly, I may think I realize that I was not competent to judge.* Do you think this is an expression of doubt?
Spiritus
Both the OP and I, I believe, have expressed that these are, in fact, sciences by any explicit and intuitive definition. I certainly have no problem with that.
Apart from that, considering something like astrology as “not a science” serves to undermine one of your beefs with the OP’s phrasing, for if it weren’t for astrology we wouldn’t have astronomy (AFAIK anyway), and so it must be a science. No?
Lib, First off, thank you for your kind words the other day. I take a compliment from you as high praise. Now, onto responding-
Yes, I think that (in general) “people who once believed but now don’t … lacked ‘true faith’” … or more precisely, that the epistemology that was their religious belief system could not include them without continued faith, with doubt about the religion’s basic postulates, and thus they are no longer believers. Whereas the epistemology that is scientific inquiry allows for, nay mandates, some level of doubt. The postulates are accepted without proof, as in religion, but they are accepted only tentatively, for the duration that they continue to be consistent with available evidence. Does every religious person have absolute faith? No, of course not. But as a belief system, religion encourages it, places absolute faith as the ideal, and science instead encourages the regulation of doubt. (Please recognize that I personally am a theist. I believe in Right and Wrong as absolutes independent of whether or not we believe in them. At times, I regret that my personal faith in these beliefs is as weak as it is. And at other times I regret that I need to believe in such absolutes despite evidence to the contrary. But further discussion of these issues would be a major hijack.)
Jerevan, you, as usual, make perfect sense. Perhaps some individuals forget to have doubt, but I believe that your philosophy is the nature of true scientific inquiry. “Scientists” without any doubt are not, IMHO, really scientists. Even if they produce good data. They are no longer trying to improve the metaphor and might as well be astrologers. Your Einstein example is well made; only when you had someone who had enough doubt about some basic beliefs did the science significantly progress in that fundamental way. But you also point out the practical side. Each of us knows only a small part of the picture. We do take for granted that the other pieces provided for us outside our area of knowledge are “true” to a reasonable standard of certainty. You tell me that the substance is A and I believe that that is very likely the case. I’ve got enough doubt to regulate with what’s on my own plate.
Mr. S., perhaps an example from an area of your background will suffice: your background was cultural anthropology, correct? Can cultural anthropology be studied scientifically? Can you make observations without instrumental measures and make hypotheses about the culture based on those observations that guide your future data collection? Hypotheses that might even be falsifiable? Is it suddenly more scientific if you created a machine to make the same measurements with the exact same reliability as human observers?
But the great thing about instruments is that they are considered more reliable than human observers. This is why they take the measurements, and we read their output. This is why the trend in instrument development is to make software to control the instruments, and software to interpret the data.
Do you only have a “really high level of confidence” that computers work? In what way does this doubt manifest itself?
ding-ding-ding-ding That’s correct! Tell eris what she’s won, Bob!
Yes, I am now part of that which is part of BMS. We have been assimilated. Resistance was futile. We should talk more about this off-board.
Yes, I do see the distinction you’re making. But I still think that to say at some point in the future I may realize that I am wrong about now expresses doubt about the present, however qualified. And there is a distinction between what one thinks is true (on this board my name is JS) and what one does with that “fact” (I post under the name JS). Not unlike the distinction you and I were making in That Other Other Thread, between the action one causes, however unintentionally, and what one does about it… Anyway, yes, your perspective here does appear to resolve certain issues, but I need time to ponder it. I can’t quite buy into it just yet. But I may be wrong about that.
If you’ve ever used Windows, how can you even ask these questions? Seriously, though. Yes, computers are wonderful in many respects, especially in my field. With advances in size and speed over the last ten years, computers now run very complex instrumentation very handily and allow us to automate certain routine aspects of our work which, previously, we had to do by hand. However, computers aren’t flawless. They make mistakes, they lose communications with the instrumentation, they crash. Doubt therefore manifests itself as not assuming that the computer will execute this task or this experiment flawlessly; take steps to ensure that the system is in good working order first, and don’t commit six months worth of work (chemistry) to a single experiment where it may go down the drain (or out the instrument exhaust).
Exactly, excellent point. It’s more a matter of trust in someone else’s method than anything. That is why I try to avoid speaking in absolutes at work, because I don’t have the whole picture about molecule A either. I especially like your concept of the regulation of doubt expressed in your response to Libertarian.
Well, speaking of work, I suppose I should go there now…
Thanks, DSeid. I think I understand your position now. Reading Jerevan’s and Eris’s discussion, there doesn’t seem to be a consensus on the matter of doubt in science. I agree with those who say that a scientist ought to be skeptical about scientific matters. That is NOT to say that he (or anyone) ought to be categorically skeptical. Try explaining to your lover why you doubt her love, and you’ll see what I mean.
erl
Actually, the OP tried to eliminate astrology from inclusion in his definition. Neither of you has, to my recollection, explicitly taken a stance on whether recording music is a science. For myself, neither fits my intuitive definition of science, and I therefore find any explicit definition of science which includes them to be flawed.
No. While I am open to the consideration of some early astrological work as scientific, that is not the same thing. The person mapping planetary motions in 300 BC and the person writing drivel in today’s paper may both be labeled “astrologist”, but they are not doing the same thing.
Spiritus… is recording music a science? This is a difficult question to answer, because if we say “no” then we are left asking what distinguishes this work from a typical analytical chemist doing a shake-flask titration (add some base, shake for 24 hours, measure, add some more base, shake for 24 hours, measure… etc). Both are collecting what could be considered data that could falsify a theory; neither is intending to falsify a theory.
If we answer yes then we open the door to all sorts of nonsense regarding science, and the term starts to become meaningless.
But if all it takes to make a sound engineer a scientist is the continuing doubt that the equipment he uses really records sound, well, I think that is a rather elitist way to judge scientific work. Am I not a scientist simply because I do feel that some of our inquiries have led to truth (if only in a limited context)? Am I not a scientist simply because I don’t have the perspicacity to create a full-fledged theory? Am I not a scientist because I expect to find consistent data from a particle collision? Popper’s scientist should never be surprised by strange, unexplainable data. He should expect it. And yet this is almost never what we see, apart from perhaps a few exceptions (none of which I can think of, but there are probably some throughout history).
Resistance was futile, indeed. Glad you still have work. I know there was a bit of an exodus…
But the key part is left out! I may think I realize… etc. In other words, I could become deluded in the future! Of course, in the future I could say I was deluded in the past. Neither really affects the mark of absolute certainty in something.
Do you doubt these things:[ul][li]your gender[/li][li]your name[/li][li]your birthplace[/li][li]that you do not surreptitiously travel to the moon while you are asleep at night, only to be returned to earth the next morning just before you wake up[/li][li]that everyone, in fact, knows that you surreptitiously travel to the moon each night and are part of a plot to cover it up[/li][li]that you do not, in fact, possess a hand[/li][/ul]Each of those is open to scientific inquiry by modern theories, except perhaps one, which is still open to empirical investigation alone. If you say that you do doubt these things, that you don’t have absolute confidence in them (or absolute confidence in their falsity, whatever the case may be for each individual one, since I doubt you’d feel they were all true, even if not absolutely true), then there is little more I can say. These things I simply do not doubt. I remain open to the possibility that they may be false (or true, as the case may be), but I do not think they are. There isn’t a shred of doubt in my mind, for I’ve found no reason to have any. I dare say that if my perspective on any of those changed, my entire worldview would need some serious overhauling. Those propositions, while pretty much empirical, are statements whose truth I can be certain of.
Jerevan:
Well, I wasn’t really relying on Popper’s definition in my response, per se, but rather pointing out a weakness with trying to “rescue” Popper’s criterion, as it were, by expanding it to include “potential” falsifiability. As a technique, falsification is extremely useful, but as a criterion of demarcation, much less so. In fact, my idea for starting this thread in the first place was to see if I could “falsify” my definition (falsification as technique), and as has been noted elsewhere, this ain’t exactly rocket science (falsification as a criterion).
Yeah, after I posted that last response, I began to suspect that I was missing your point.
Regarding your history of the thermometer:
First, the good news: Thanks for presenting a clear, detailed example of the way in which you view the connection between measurement and perception.
Now, the bad news: I’m not completely convinced by the historical narrative you present regarding the actual development of the Kelvin scale. I realize that it was necessarily simplified for various reasons, but beyond that, I feel it glosses over some difficulties which, once pointed out, will hopefully clarify my thinking on the question.
To begin with, your explanation of how the Kelvin scale came into being reads like something one would find in a standard introductory chemistry textbook, such as a high school or undergraduate text on the subject. That’s all well and good, but the point of such texts is to teach chemistry, not the history of chemistry or the history and development of instruments of measure. So I have to ask, did Kelvin actually build his thermometer like that? Because there is at least one other alternative: he took a standard Fahrenheit thermometer and simply recalibrated it so that 32[sup]0[/sup] was marked as 0[sup]0[/sup], and 212[sup]0[/sup] was marked as 100[sup]0[/sup]. That’s what I would have done, rather than, as it were, reinvent the wheel.
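Incidentally, the recalibration described above is nothing more than a linear map pinned by two reference points. A minimal sketch, assuming only the standard fixed points quoted in the post (the function name and sample value are my own, purely illustrative):

```python
# Relabeling a thermometer scale: a linear map fixed by two reference
# points. Mapping Fahrenheit's 32 -> 0 and 212 -> 100 reproduces the
# centigrade-style markings described above. Illustrative sketch only.

def recalibrate(reading, old_lo, old_hi, new_lo, new_hi):
    """Linearly map a reading between scales via their two fixed points."""
    fraction = (reading - old_lo) / (old_hi - old_lo)
    return new_lo + fraction * (new_hi - new_lo)

print(recalibrate(32, 32, 212, 0, 100))    # freezing point: 0.0
print(recalibrate(212, 32, 212, 0, 100))   # boiling point: 100.0
```

The point, of course, is that no new measurement is involved: once the two fixed points are agreed upon, the relabeling is pure arithmetic.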
Anyway, the real problem with your narrative, IMHO, starts here:
Actually, in one sense, this is a great description. But I’ll get back to that in a minute – first, the critique. You seem to think that the freezing and boiling points of water are straightforward, unequivocal quantities that are easy to decide upon: they present themselves to our senses, in other words, in a totally direct manner. This might in fact be the case, but for the sake of my argument I propose an alternative view: namely, that these quantities are not so easily discerned, and that the markings on the thermometer came about, at least to a very large extent, as the result of a group of scientists arbitrarily deciding, on a consensual basis, that about here, approximately, water freezes – and that about here, approximately, water boils.
When do we consider water to be at freezing temperature? Do we place the thermometer in a block of ice? How long do we wait before we take it out? Or do we mean that water freezes at just that point when a body of water of a certain size, at a certain temperature, begins to develop ice crystals? How do we control for possible convection currents? And, finally, how do we account for the fact that almost invariably, every time we take a measurement, we get a slightly different reading?
When is water “boiling?” Is it when the first small bubbles appear at the bottom of the pan? If so, why just then? Why not wait until it’s at a roaring boil? Etc.
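And that last difficulty – the slightly different reading every time – is ordinarily tamed by statistics: the “freezing point” that ends up on the scale is a summary of many scattered readings. A hedged sketch of what I mean (the readings are invented purely for illustration):

```python
import statistics

# Hypothetical repeated ice-point readings (degrees); invented numbers,
# meant only to illustrate that the reported value is a statistical
# summary of scattered measurements, not a single self-evident one.
readings = [0.1, -0.2, 0.0, 0.3, -0.1, 0.2, -0.3, 0.0]

center = statistics.mean(readings)
spread = statistics.stdev(readings)
print(f"ice point: {center:.2f} +/- {spread:.2f}")
```

Which value of the scatter to report, and how much scatter is acceptable, is exactly the sort of thing settled by the consensus process described above.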
As with my solar neutrino example, it is these sorts of very basic, not-easily-answered questions that lead on to various scientists disputing various results, and arguing about it in various specialist journals, until finally a consensus is reached. All of which constitutes the almost hidden social side of the Scientific Project.
These questions reflect back on another, deeper issue, which I characterize, somewhat clumsily, as the question of “foreground vs. background.”
This occurred to me when I was composing a response to Spiritus, above:
It suddenly struck me that our differences were related to the question of how we defined what acted as background, and what acted as foreground, in the process of measuring. Spiritus (I think, if I have understood him correctly), defines the background as the Object, i.e., Nature; and so do you, if I’ve followed your reasoning correctly. You argue that Nature forms the stable background from which instruments derive their measurements, and your history of the development of the Kelvin scale is designed to indicate how you think that process works, and thus exemplify your underlying thesis. This is also the reason why Spiritus rejected my Og story, I suspect; as he wrote:
From my perspective, however (if I were to be categorical), I would state the opposite: measurement is the background, as it were, from which we derive our picture of Nature. Thus, all measurements are essentially arbitrary and in fact socially agreed upon. That’s why in my story of Og the caveman, his yea stick was arbitrarily long, rather than say, measured against some other length (like his own height, for example). Once Og had decided, in this case extremely arbitrarily, on a measure, he could then use it to quantify distances that otherwise, as Objects, were directly experienced by him as equivocal and uncertain. Much like your story, mine also contains embedded within it the presuppositions of my position.
By the way, I would just like to point out that even if I had decided to let Og give his yea stick a specific length, I do not think that any of us would classify the work that went into the making of the stick as “scientific.” Og wants a stick the length of his own body; he cuts one that long. Etc. Thus, arguably, the actual work done in developing the Kelvin scale, which really involved only the construction of a device much like Og’s stick – although much more complicated, naturally – might not be such a good example of scientific work, either.
Even though this is shamefully incorrect, just for the sake of shorthand I’m going to label your view, i.e., that Nature is the stable backdrop and measuring instruments the foreground, as positivistic. (Apparently, I love to label things here.) I’m going to refer to the second point of view, in which constructed instruments form the background, as cultural relativism.
If I were a positivist, I would argue: “Our experience of Nature is direct and unequivocal. Our measurements are derived from this direct and unequivocal apprehension of Nature, and are thus a reflection of it – not the other way round.”
If I were a cultural relativist, I would argue: “Our experience of nature is variable and equivocal. Our measurements are arbitrary, socially constructed standards that allow us to standardize our variable experiences, and create the illusion that they are unequivocal.”
Being a Gemini, and thus always of two minds (not that I believe in astrology, mind you), I would straddle the fence and say: “Nature seems to be constant, even though our experiences of her are often variable; measurements are arbitrarily, socially agreed-upon standards constructed in a dialogue with Nature,” or something along those lines.
What I liked about your description, above, was the way you indicated that heat, a specific tactile sensation, was “translated” into a visual measure. This is really my point: when we read a thermometer, we’re translating a global, variable, and diffuse tactile experience into a measured, visually presented quantity. Especially in this case, it seems clear to me that we articulate the experience of heat in terms of an agreed-upon measurement; and in order to do this, we need to agree upon the measurement first. I have been living in Sweden for 12 years, but it is only recently that I have begun to understand, at a bodily level, what the temperature gauges over here imply: I know immediately, if somewhat approximately, how warm it is when you say, “It’s 73[sup]0[/sup] out today;” but if my girlfriend had said to me 5 years ago, “It’s 17 degrees out today,” I wouldn’t have had the foggiest clue how warm or cold it was outside. I’d have known, basically, that it was somewhere above freezing. Don’t know if that really makes the point I’m striving towards, but….
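For what it’s worth, the arithmetic behind that translation is trivial; it’s the felt meaning that takes 12 years. The standard textbook conversions, applied to the two figures quoted above (a sketch of my own, nothing more):

```python
# Standard Fahrenheit/Celsius conversion formulas, applied to the
# temperatures quoted in the post. The formulas are the textbook ones;
# the surrounding code is purely illustrative.

def f_to_c(f):
    return (f - 32) * 5 / 9

def c_to_f(c):
    return c * 9 / 5 + 32

print(f"73 F is about {f_to_c(73):.1f} C")   # a warm day, ~22.8 C
print(f"17 C is about {c_to_f(17):.1f} F")   # a cool day, ~62.6 F
```

The numbers map onto each other in a second; the bodily calibration of what 17 degrees feels like is the part no formula supplies.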
I could probably write volumes more about this, but the post is already overlong and I have others to respond to. Hope this provides some more food for thought.
Well, I think you outline the difference between our two perspectives fairly clearly. I accept that, as a logical consequence of my earlier statements, I must say of each thing in your list, “No, I do not have absolute, 100% certainty about that.” (Partly because I have seen the X-Files and I know what goes on! ) I think the key difference may be in how each of us is using the word certainty. If I may, my impression is that you mean “there is enough proof that I see no reason to think that it is not so”, while to me it means “there is enough proof that I (must) accept that this is the absolute reality.” Both are perfectly valid definitions/applications of the idea of certainty; yours logically allows for a change in perception without violating the definition while mine, it seems, does not. If I have accepted that this is reality, then logically how can I say in the future, “I now think that reality is different”? Now, to take my point of view to absurd (key word) lengths:
(1) my gender. To my knowledge I have never had a genetic test, so a key piece of evidence in demonstrating this is still missing.
(2) my name. Well, turn it around: how do I prove what I think my name is?
(3) since I do not have a conscious memory of where I was born, it is possible that what I think I know about this (or have been told) is wrong.
(4) & (5) Unlikely but I can’t say that this has never occurred. See X-Files remark above. And at this point it can’t be proved, because the past opportunities to investigate this thoroughly are gone and no longer open to rigorous review.
(6) I possess what I conceptualize as a hand. This is not the same as saying my hand really exists.
I realize this sounds ridiculous and you’re probably beginning to think my mind is so open that my brain’s gonna roll out any minute now. I admit, in practice I generally conduct my life as if those things are true and don’t go out of my way to find (dis)proof. But there is a difference between accepting that the model of reality works, and believing that you actually know what reality is. I guess that tiny, tiny shred of doubt is always there, lurking in the back of my mind.
Plus, the things you mention are things which relate, at least in part, to my subjective experiences. That doesn’t close them to doubt, but it does make them less uncertain to me, which is not the same as saying they are real to other people. In studying the natural/physical world, or other people’s subjective realities, my doubt is higher. But do keep in mind, I’m still working through the implications myself, and may not be explaining my thinking too clearly at this point.
And of course, this irony struck me a bit earlier today: “How can I be certain that nothing is certain?”
A thing being possible does not make it necessary.
Then it is possible that nothing is certain. Whew! That was close.
Thanks, Lib.