Friday, December 31, 2010

From the 2010 APA in Boston: Neuropsychology and ethics

This session featured a single speaker, Joshua Greene from Harvard, known for his research on "neuroethics," the neurological underpinnings of ethical decision making in humans. The title of Greene's talk was "Beyond point-and-shoot morality: why cognitive neuroscience matters for ethics."
Greene started out by acknowledging that there is a pretty strong line separating is and ought, but he contended that there are important points of contact, particularly when it comes to evaluating moral intuitions. Still, he was clear that neither neuroscience nor experimental philosophy will solve ethical problems.
What Greene is interested in is finding out what factors moral judgment is sensitive to, and whether it is sensitive to the relevant factors. He presented his dual-process theory of morality, proposing an analogy with a camera. Cameras have automatic (point-and-shoot) settings as well as manual controls. The first mode is good enough for most purposes; the second allows the user to fine-tune the settings more carefully. Together, the two modes allow for a nice combination of efficiency and flexibility.
The idea is that the human brain also has two modes: a set of efficient automatic responses and a manual mode that makes us more flexible in response to non-standard situations. The non-moral example is our response to potential threats. Here the amygdala is very fast and efficient at focusing on potential threats (e.g., the outline of eyes in the dark), even when there actually is no threat (it's a controlled experiment in a lab, no lurking predator around).
Delayed gratification illustrates the interaction between the two modes. The brain is attracted by immediate rewards, no matter what kind. However, when larger rewards are eventually going to become available, other parts of the brain come into play to override (sometimes) the immediate urge.
When it comes to moral judgment, Greene's research shows that our automatic setting is "Kantian," meaning that our intuitive responses are deontological, rule-driven. The manual setting, on the other hand, tends to be more utilitarian / consequentialist. Accordingly, the first mode involves emotional areas of the brain, the second more cognitive areas.
The evidence comes from the (in)famous trolley dilemma and its many variations. I will not detail the experiments here, since they are well known. The short version is that when people refuse to intervene in the footbridge (as opposed to the lever) version of the dilemma, they do so because of a strong emotional response, which contradicts the otherwise utilitarian calculus they make when considering the lever version.
Interestingly, psychopaths turn out to be more utilitarian than normal subjects - presumably not because consequentialism is inherently pathological, but because their emotional responses are stunted. Mood also affects the results, with people exposed to comedy (to enhance mood), for instance, more likely to say that it is okay to push the guy off the footbridge.
In a more recent experiment, subjects were asked to say which action carried the better consequences, which made them feel worse, and which was overall morally acceptable. The idea was to separate the cognitive, emotional and integrative aspects of moral decision making. Predictably, activity in the amygdala correlated with deontological judgment, activity in more cognitive areas was associated with utilitarianism, and different brain regions became involved in integrating the two.
Another recent experiment used visual vs. verbal descriptions of moral dilemmas. Turns out that more visual people tend to behave emotionally / deontologically, while more verbal people are more utilitarian.
Also, studies show that interfering with moral judgment by engaging subjects in a cognitive task slows down (though it does not reverse) utilitarian judgment, but has no effect on deontological judgment. Again, this agrees with the conclusion that the former modality is the result of cognition, the latter of emotion.
Nice to know, by the way, that when experimenters controlled for the "real world expectations" that people have about trolleys, or when they used more realistic scenarios than trolleys and bridges, the results didn't vary. In other words, trolley thought experiments are actually informative, contrary to popular criticisms.
What factors affect people's decision making in moral judgment? The main one is proximity, with people feeling much stronger obligations if they are present at the event posing the dilemma, or even relatively near it (a disaster happens in a nearby country), as opposed to when they are far away (a country on the other side of the world).
Greene's general conclusion is that neuroscience matters to ethics because it reveals the hidden mechanisms of human moral decision making. However, he says this is interesting to philosophers because it may lead one to question ethical theories that are implicitly or explicitly based on such judgments. But neither philosophical deontology nor consequentialism is in fact based on common moral judgments, it seems to me. They are the result of explicit analysis. (Though Greene raises the possibility that some philosophers engage in rationalizing, rather than reasoning, as in Kant's famously convoluted idea that masturbation is wrong because one is using oneself as a means to an end...)
Of course this is not to say that understanding moral decision making in humans isn't interesting or in fact even helpful in real life cases. An example of the latter is the common moral condemnation of incest, which is an emotional reaction that probably evolved to avoid genetically diseased offspring. It follows that science can tell us that there is nothing morally wrong in cases of incest where precautions have been taken to avoid pregnancy (and assuming psychological reactions are also accounted for). Greene puts this in terms of science helping us to transform difficult ought questions into easier ought questions.
Personal question at the end of all this: if emotional ethical judgment is "deontological," and cognitive judgment is utilitarian, could it be that the integration of the two brings us closer to behaving in a way consistent with virtue ethics? Something to ponder, methinks.

Wednesday, December 29, 2010

From the 2010 APA in Boston: Teleological thinking in scientific explanations

The first talk of this session was by Devin Henry, Western Ontario, who examined Plato's and Aristotle's accounts of teleology in the light of the concept of optimization. In the Phaedo Socrates says that we need to inquire into what is the best way for things to be, a research program stemming from the idea that the universe was put together by a mind aiming at what is best (because that mind is supremely good). The universe is the way it is by necessity, because that is the best way for things to be. Finding that necessity explains a given phenomenon.
This idea is seen by the author as the ancestor of Aristotle's ideas on the subject, including that nature does nothing in vain. It also follows that being the best is in accordance with nature. However, there are important differences between Plato and Aristotle. For instance, Socrates makes his argument at the cosmological level: the good is the good of the whole cosmos, not of individuals (indeed, the other way around: individuals are for the good of the cosmos). Aristotle doesn't invoke a cosmological principle: what is good for the organism is good for it, not for the broader context of the cosmos.
A second difference is that Plato clearly speaks of an intelligent designer. While Aristotle's language is full of design talk, his personification of nature is only metaphorical, like Darwin's. Aristotle's form of teleology is seen in his analysis of why snakes do not have legs. Nature does nothing in vain while doing the best for the organism: if the length of a snake is a built-in feature, and if no blooded animal can move with more than four points of leverage (as Aristotle thought), then having no legs is better than having some legs (a centipede-type solution wouldn't work for blooded animals).
Aristotle even criticized what today we would label a Panglossian view of the world: things are the best they can be, not the best they can be conceived to be. (Again, close to modern biologists' conception of constraints; the author cited the Gould & Lewontin paper on spandrels.) So Aristotle's concept of teleology is based on optimality, not perfection.
In his analysis of male testes, for instance, Aristotle claims that we need to understand the function of the organ in order to understand its form. Again, a remarkably modern-sounding connection between form and function. Aristotle was aware that some species of animals (fish) don't have testes, which means that testes cannot be essential for reproduction, and yet must somehow make reproduction work better in the animals that have them. (Aristotle's specific explanation, that testes slow down sperm production, is not the correct one, of course, but the idea still guides functional biology today.)
The second talk was by Jeffrey McDonough, Harvard. A teleological explanation purports to explain something in terms of its outcome. In ancient and early medieval periods the range of teleological explanations was broad, including not just rational beings, but living beings more generally, and even features of the cosmos at large.
In Plato, as well as for Augustine and Aquinas, goodness is prior to being: the universe exists because it is good, it isn't good as a consequence of existing. So goodness figures into explanations of why things are. Also, in this view, teleological explanations are just as appropriate as, if not better than, efficient explanations.
This ancient view, however, seemed to commit one to some sort of moral necessitarianism, where god simply has to do what is good, in contradiction with the classic Christian view of divine agency. In later medieval and early modern views, from Scotus to Boyle to Descartes, we see the concept of a libertarian will, where one could choose something that is not best. This means, however, that one can no longer explain what the agent does by considering the outcome. It is the will's efficient decision that becomes central to explanation.
This quickly led to philosophers giving up teleological explanations (final causes) for anything that is not a rational agent (god, angels, and human beings). Hence a mechanistic view of anything that is not a rational agent, a la Descartes.
In more modern times, Spinoza is considered the ultimate enemy of teleology and final causes, again, however, with the exception of rational agents. But Spinoza was also a naturalist, and it becomes difficult to justify limiting teleology only to a particular subset of natural entities. Accordingly, for him there is no sharp distinction between rational and non-rational agents. Spinoza also rejected the idea of objective goodness, which means that one cannot invoke goodness as explanatory. For Spinoza we do not strive toward certain things because we think them valuable; on the contrary, we think certain things valuable because we happen (by our nature) to want them.
Leibniz, on the other hand, presented himself as a strong defender of teleology, in important ways harking back to the Greeks. God here does things because they are good, but god has to consider total goodness, and so chooses whatever maximizes good overall, which may not be what is individually good. Leibniz therefore opens himself again to the problem of moral determinism (for finite agents) and moral necessitarianism (for god). Hence some of his compatibilist maneuvering when it comes to free will.
Overall, it seems to me that this session was badly titled, as neither talk (and particularly the second one!) had much to do with scientific explanations, certainly not in the modern sense of the term. Oh well.

From the 2010 APA in Boston: Social networking and philosophy

The APA meeting in Boston is turning into a disaster because of the weather: many sessions have been canceled because speakers couldn’t get to this frozen hell, while other sessions are being run by substitute speakers gathered at the last minute, with some presenting talks that only have a vague connection to whatever it was that the original session was supposed to be about.
This particular session was billed as having to do with how Twitter is changing the connectedness of philosophical communities, but turned out to be about social networking more broadly. Neither of the original speakers was present, and neither of the two replacement talks was about Twitter specifically. Oh well.
The first speaker was Casey Haskins (SUNY Purchase), who announced that he was going to talk about aesthetics and interconnected communities (though in the end aesthetics didn't really make much of an appearance, probably a good thing). I find it amazing that someone would give a talk about Twitter, Facebook, and RSS feeds while freely admitting that he doesn't know much about them and has in fact only just started using them.
"Small worlds" (the term Haskins uses for social networks) can be thought of as analogous to biological ecosystems that exchange information instead of organic materials. They are media that allow our "extended minds."
The guy was all over the place, using the term "good ideas" to talk about things ranging from Twitter to the evolution of coral reefs (apparently, nature can have ideas too, though what determines whether Twitter and corals are "good" isn't clear).
Reefs are then conceptualized as "platforms," apparently in the engineering sense (like Twitter!), structures that make it possible for other things to happen. He suggests an analogy between information flow on Twitter and material flow in biological ecosystems. I couldn’t be more unconvinced.
Twitter is presented as a "cultural exaptation" of a more primitive text-based system (hence the 140-character limit). Of course this is an example of an intentional exaptation, and hence yet another disanalogy with biology. I wonder what's up with some philosophers' biology envy. Someone should do a sociological study on this.
The second co-opted speaker was Saray Ayala (Universitat Autonoma de Barcelona). She talked about whether the computational theory of mind accounts for the “extended mind” (again!) made possible by environmental inputs, including social networks. Social networks (as environmental structures) may impose constraints on the functioning of our minds, and some of these constraints may not be computable.
She brought up an interesting example of robots that literally "embody" the ability to carry out simple computations, by virtue of the way they are physically put together. A particular morphology of the robot plays the role of the hidden layer in a three-layer system producing a logical XOR function (the other two layers being the input and the output).
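For readers unfamiliar with why a hidden layer is needed here: XOR is not linearly separable, so no single input-to-output layer of threshold units can compute it, while one hidden layer suffices. Here is a minimal sketch in Python with hand-picked weights; the talk did not, of course, give the robot's actual wiring, so the particular units below are my own illustration.

```python
# XOR via a three-layer threshold network: input -> hidden -> output.
# In Ayala's example, the robot's body morphology plays the role of the
# hidden layer; the weights here are hand-picked for illustration only.

def step(x):
    """Simple threshold (McCulloch-Pitts style) activation."""
    return 1 if x > 0 else 0

def xor_net(a, b):
    # Hidden layer: two threshold units.
    h1 = step(a + b - 0.5)   # fires if at least one input is on (OR)
    h2 = step(a + b - 1.5)   # fires only if both inputs are on (AND)
    # Output layer: h1 AND NOT h2, i.e., "one but not both" = XOR.
    return step(h1 - h2 - 0.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_net(a, b))  # 0 0 -> 0, 0 1 -> 1, 1 0 -> 1, 1 1 -> 0
```

The point of the example is that the middle layer need not be "neural" at all: anything with the right input-output structure, including a physical body, can do the work.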
The author then suggested that a computational theory of mind does not explain the environmental contribution of social networks to mind, because the theory treats the environment as background, passive with respect to computation, rather than as a structural component of what the mind does. Well, I'm not too sympathetic to computational theories of mind anyway, so I'll need to look into this.

Friday, November 19, 2010

A Query (or two) on Coherence

Two formal theories of coherence are currently available. The probabilistic views have suffered heavy criticism, partly because of their inability to capture explanatory relations, which seem to be at the heart of coherence. Thagard's model of explanatory coherence fares better here. But it is not clear whether the coherence that Thagard describes so well is a good measure of the short-term reliability of a scientific claim. It captures well episodes in the history of science, but that could be more because of the fecundity of coherent views (which helps produce long term success) rather than short term reliability.
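For concreteness, the probabilistic approach can be illustrated with Shogenji's well-known ratio measure, on which a set of claims coheres to the degree that their joint probability exceeds what independence would predict. The code and numbers below are my own toy illustration, not drawn from any particular paper:

```python
# Shogenji's coherence measure: C = P(joint) / product of the marginals.
# C > 1 means the claims hang together better than independent claims would;
# C = 1 means they are probabilistically independent.

def shogenji(p_joint, marginals):
    """Ratio of the joint probability to the product of marginal probabilities."""
    prod = 1.0
    for p in marginals:
        prod *= p
    return p_joint / prod

# Two claims that tend to be true together: P(A) = P(B) = 0.5, P(A & B) = 0.4
print(shogenji(0.4, [0.5, 0.5]))   # 1.6 > 1: the pair coheres

# Independent claims: P(A & B) = P(A) * P(B), so coherence is exactly 1
print(shogenji(0.25, [0.5, 0.5]))  # 1.0
```

Note that nothing in the ratio tracks *explanatory* relations between the claims, which is precisely the criticism mentioned above.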

So, here is the query: 1) Are there any other contenders for a theory of coherence out there? and 2) Do we have any reason to think that a more coherent (however construed) view is more reliable right now?

Monday, November 8, 2010

Conference: Evolution, Cooperation and Rationality (Bristol, June 2011)

An international conference at the University of Bristol, June 27th-29th, 2011

The conference forms part of the AHRC-funded project on Evolution, Cooperation and Rationality, based in the Department of Philosophy at the University of Bristol, under the direction of Samir Okasha and Ken Binmore. The aim of this inter-disciplinary project is to study the connections between evolutionary theory and rational choice theory. The first project conference, held in September 2009, explored the different theoretical approaches to decision-making and social behaviour used in biology, economics, and psychology.

This conference is a sister to our 2009 conference, but with a more philosophical focus. The aim is to explore the philosophical foundations of recent scientific work on co-operation and social behaviour, in both human and non-human animals.

Confirmed Speakers:
Elliott Sober, Peter Godfrey-Smith, Kim Sterelny, Samir Okasha, Ken Binmore, David Papineau, Cedric Paternotte, Jonathan Grose

Papers will be both contributed and invited. For further details, including information on how to submit a paper, please see our conference website:

Saturday, October 23, 2010

CFP "More Too Funky Causation" (Funky III), February 23-24, 2011, Ghent.

The Department of Philosophy and Moral Sciences, Ghent University, Belgium, is proud to announce a call for papers for:
"More Too Funky Causation" (Funky III), February 23-24, 2011.
Keynote speaker is Jeffrey K. McDonough (Harvard): "Leibniz on Agency and Optimal Form"
The conference is the third to explore *funky* notions of causation in historical perspective.
'Funky' causes are defined negatively as those notions of causation that are neither final nor (Humean) efficient causation.
We welcome paper proposals that explore a funky cause in depth. Topics need not be limited to Early Modern topics or figures,
but we would especially welcome papers on formal causation.
Abstracts (no more than 500 words) prepared for blind review should be emailed to Eric Schliesser by December 1. Inquiries can be directed to the same address.

Friday, October 22, 2010

Ravello meeting on Chance and Necessity, part III (last one)

I am at the Ravello meeting on Chance and Necessity in biology, on the 40th anniversary of Jacques Monod's seminal book, and will be posting a few entries while the meeting is going on this week.
The gathering is organized by Giorgio Bernardi, sponsored by International Union of Biological Sciences and Istituto Italiano di Studi Filosofici.
What follows are the raw and somewhat selective notes only, in order of presentation of the various speakers. Hopefully this will provide a feeling for what the meeting is about and generate some discussion. Throughout, parenthetical comments are my own, unless otherwise noted.
Denis Duboule, Constraints (necessity) and flexibility (chance) in the evolution of vertebrate morphologies.
Across vertebrates the structure of proximal bones is strongly constrained, while there is a lot of variation in distal structures, like the number and shape of digits. This pattern appears to be related to the pattern of deployment of a cluster of Hox genes during the development of vertebrate limbs. It is the differential regulation of distal Hox that generates the type of phenotypic variation that shows up in evolution. The reason the proximal pattern of the limb is much more constrained is that its regulation has been co-opted from the trunk, and the latter is obviously resistant to evolutionary change. (Nice and elegant explanation.) There are exceptions, like limbless lizards and snakes. But in those cases, obviously, you do also observe dramatic changes in the trunk. There is a similar reason why tetrapods cannot have symmetrical limbs: the developmental genes that cause the asymmetry are co-opted from the trunk, and changing the pattern would affect the trunk in inviable ways.
Walter Gehring, Chance and necessity in eye evolution.
Jacques Monod compared the eye to the camera to highlight both the similarities, as in the relation between form and function, and the difference between teleonomy - for the eye - and teleology - for the camera. Monod's insight is confirmed by modern research on how genetic control is deployed during the development of the eye: the observed patterns are clearly not the sort of thing that an engineer would put in place, but are instead the kind of hodgepodge that results from sequentially overlapping historical events. Eyes of vertebrates, insects, cephalopods, and other invertebrates have been thought to be non-homologous because they are morphologically different and because they develop differently. Molecular biology, however, shows a "deep homology" in the fact that all these eyes are affected by different versions of the Pax6 gene. (But of course that raises the thorny question of the degree of congruence of homology at different levels: is the genetic one more fundamental than the developmental one? On what grounds?)
Takashi Gojobori, Chance and necessity in the evolution of connections between sensory and nervous systems.
Starts out with gene expression in Planaria brains, the most primitive of all structures that we recognize as brains. It turns out that half of the known Planarian genes expressed in the head are shared with humans. Next, what about Hydra, which does not have a central nervous system, just a diffuse nerve net? Again, half of the relevant genes are also expressed in human nerve cells. What about sea urchins, which have lost a central nervous system? Sure enough, gene expression patterns show that the arms of sea urchin larvae are degenerated from an ancestral, more fully developed nervous system. Gojobori is looking for connections between sensory and nervous systems back in Hydra, because of the simplicity of their nervous system, with a focus on gap junctions as precursors of fully formed sensory-nervous connections. (Once again, not much here about Monod, chance or necessity, but it's near the end of the meeting...)

Ravello meeting on Chance and Necessity, part II

I am at the Ravello meeting on Chance and Necessity in biology, on the 40th anniversary of Jacques Monod's seminal book, and will be posting a few entries while the meeting is going on this week.
The gathering is organized by Giorgio Bernardi, sponsored by International Union of Biological Sciences and Istituto Italiano di Studi Filosofici.
What follows are the raw and somewhat selective notes only, in order of presentation of the various speakers. Hopefully this will provide a feeling for what the meeting is about and generate some discussion. Throughout, parenthetical comments are my own, unless otherwise noted.
Werner Arber, Contingency of spontaneous genetic variation.
Talk started with a (somewhat peculiar) historical overview leading from Darwinism and the Modern Synthesis, through Watson and Crick and genomics, to a broader "synthesis" concerning molecular evolution. Different definitions of mutation apply if the issue is considered from a phenotypic or a molecular perspective, of course (just like the definition of gene itself). (Long-winded) introduction covering basics of molecular genetics. (Not clear at all what the point of this was, other than giving us a quick molecular genetics 101.)
Masatoshi Nei, Hugo de Vries and species formation: new perspectives from recent genomic data.
de Vries was famous for his experiments on mutations in Oenothera plants (and for contributing to the rediscovery of Mendel's work). These mutations were soon shown to be the result of chromosomal rearrangements and abnormalities, as opposed to the sort of point mutations discovered at the time by Morgan in Drosophila. Stebbins referred to de Vries' mutationist theory as a figment of imagination, even though polyploidy is very common in plants and other groups (this can't be right: Stebbins was well aware of polyploidy and its role in speciation). Modern molecular genetics suggests that following genomic duplication there is a reduction in gene numbers that leads to incompatibility and speciation. (Lots of refs to Nei's own work on hybrid sterility back from the '70s and '80s.) Nei doesn't like Coyne and Orr's critique, in 2004, of his neutral model of hybrid speciation, proposed in 1983, suggesting that neutral models are under appreciated. (On this one I think Coyne and Orr were correct, actually.) (Overall, Nei seemed to want to significantly scale down the evolutionary importance of selection in favor of mutation, though I don't think his arguments were very coherent.)
Eviatar Nevo, Stress and evolution at micro- and macro- scales.
Importance of a variety of environmental stresses as major drivers of adaptive phenotypic evolution. (This has been a theme of Nevo's for decades now.) Documented differences between, for instance, underground and above-ground mammals range across morphology, behavior, and even fine aspects of physiology. No question that life style drives phenotypic evolution. Evidence for a positive relationship between genetic diversity and levels of environmental stress. Similarly, indices of sexual activity, as opposed to asexual reproduction, increase with stress. (This morning we've steered pretty clear of Monod, chance and necessity. Hopefully better this afternoon, judging from the titles.)
Eugene Koonin, The role of extremely rare events in the evolution of life.
Major transitions in evolution are examples of extremely rare events and how important they can be, e.g., origin of life, nucleotides, cells, eukaryotes, or multicellularity. How do we explain the origin of replication and translation processes? Neither natural selection nor exaptation is adequate, since both processes require replication and translation to get started. One popular answer is RNA-world-type scenarios. However, known RNA replicases are ligases, not polymerases. (Somehow) the answer is related to inflation in cosmology... which leads to a multiverse with island universes, of which ours is one, and in which the big bang becomes a local event... (Apparently) this is relevant because the number of times a given macroscopic history is repeated in an island universe is infinite. (Voila, by epistemological sleight of hand we have solved the problem!) So anthropic (so-called) selection would have preceded Darwinian selection.
Tomoko Ohta, Near-neutrality, robustness and epigenetics.
Starts with brief history of neutral and near-neutral theories of molecular evolution. Neutral theory predicts that the rate of evolution is the same as the rate of neutral mutation; near-neutral theory predicts the rate of evolution to be inversely related to population size. Much recent comparative genomic data is compatible with near-neutral expectations. Robustness of gene networks is made possible by near-neutrality (this agrees with work by both A. Wagner and S. Gavrilets). While robustness implies that many genotypes can result in the same phenotype, epigenetics results in the opposite: many phenotypes can be produced by the same genotype. (Not entirely clear what the role of epigenetics was here, but I take Ohta to imply that it increases the range of near-neutrality as a theory of molecular evolution.)
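The neutral-theory prediction mentioned above follows from a single line of arithmetic: in a diploid population of size N, new neutral mutations arise at rate 2Nμ per generation, and each fixes with probability 1/(2N), so the population size cancels and the substitution rate equals the mutation rate. A quick sketch (my own illustration of the textbook result, with an arbitrary mutation rate):

```python
# Kimura's classic neutral-theory result: the substitution rate equals the
# neutral mutation rate, independent of population size N, because the
# input of mutations (2*N*mu) and the fixation probability (1/(2*N)) cancel.

def substitution_rate(n, mu):
    """Neutral substitutions per generation in a diploid population of size n."""
    new_mutations = 2 * n * mu   # neutral mutations entering the population
    p_fix = 1 / (2 * n)         # fixation probability of a new neutral allele
    return new_mutations * p_fix  # = mu, whatever n is

for n in (100, 10_000, 1_000_000):
    print(n, substitution_rate(n, mu=1e-8))  # rate stays at 1e-8 throughout
```

The near-neutral theory breaks exactly this cancellation: for weakly selected mutations the fixation probability depends on N, which is why it predicts faster evolution in smaller populations.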
Giorgio Bernardi, The neo-selectionist theory of evolution.
The two major determinants of gene expression are cis factors and chromatin structure. Lots of stats followed about the differential abundance of the various classes of DNA trinucleotides in the human genome. Selection favors certain types of chromatin structure in vertebrates, namely those that stabilize the thermodynamic properties of the chromatin itself. Indeed, patterns concerning the distribution of GC-rich chromatin are conserved across a hundred million years of mammalian evolution. (Not clear why this is "neo-selectionist," however.)

Thursday, October 21, 2010

Ravello meeting on Chance and Necessity

I am at the Ravello meeting on Chance and Necessity in biology, on the 40th anniversary of Jacques Monod's seminal book by the same title, and will be posting a few entries while the meeting is going on this week. The gathering is organized by Giorgio Bernardi and sponsored by International Union of Biological Sciences and the Istituto Italiano di Studi Filosofici.
What follows are the raw and somewhat selective notes only, in order of presentation of the various speakers. Hopefully this will provide a feeling for what the meeting is about and generate some discussion. Throughout, parenthetical comments are my own, unless otherwise noted.
Agnes Ullman, In memoriam of Jacques Monod.
Monod was prominent in the critique of Lysenko and his brand of anti-scientific ideology. Ullman showed charming early photos and even family drawings of young Jacques. I did not know that Monod early on almost turned to a career as an orchestra conductor before concentrating full time on genetics. He was active in the French resistance during WWII in a leadership role, a dangerous position that had cost three of his predecessors their lives. After WWII Monod immersed himself in the work on bacterial protein regulation that resulted in his Nobel in 1965. The latter was made possible by the intense collaboration with Francois Jacob, who eventually shared the Nobel. Their work led of course to the classic papers on the concept of the operon and the allosteric regulation of enzymes. In the late '60s Monod was politically involved with the student protest movement. In 1969 he gave four lectures at Pomona College, on "modern biology and natural philosophy," which became the core of Chance and Necessity - the book became an unexpected best seller in the early '70s. Monod then became a very effective manager and fund raiser, starting the first French institute of molecular biology, which now carries his name. He remained involved in politics, for instance in defense of abortion rights, until the premature end of his life.
"A beautiful theory may not be right, but an ugly one must be wrong." -JM
Bernardino Fantini, Monod's vision of life and the theoretical structure of contemporary biology.
Monod's philosophical work is largely underappreciated. It is true that he did not have a professional grounding in philosophy, but he was alive to the importance of philosophy in the biological sciences. According to Francis Crick's obituary of Monod in Nature, Chance and Necessity presented a vision of life that is shared by most practicing scientists, and yet feels alien to the majority of the public: life is an accident and Darwinian evolution is the impersonal causal mechanism that shaped it. Monod was interested in the apparent paradox of living organisms functioning in a way that cannot be explained only by the laws of physics and chemistry, which constitute the foundations of our scientific understanding of the world. He saw molecular biology not as a branch of chemistry, but rather as a biological-Darwinian understanding of biochemistry. Emphasis on biological form rather than specific matter constituents ("Plato sometimes is right" -JM). Monod saw evolution not as a law or a principle of life, but rather as an emergent result of complexity and certain environmental conditions. Monod attributed the idea that everything is the result of randomness and necessity to Democritus, though no specific quote to that effect can actually be found in the Greek atomist. For Monod life is bound by the laws of physics, but requires additional causal principles when it comes to the specificity of biological information. Delbruck quasi-seriously suggested giving the Nobel to Aristotle for the discovery of the basic principle of molecular biology: that DNA plays the role of the unmoved mover in biology. Many biologists rejected the idea that structure and function, form and information, can be conceptually separated in a way reminiscent of Aristotle's causes.
Monod's ideas here derived naturally from his experimental work separating the control of enzymatic function from the biochemical function itself: allosteric control is entirely independent of the structural details of the functional enzyme.
Massimo Pigliucci, Biology as a historical and experimental science: the epistemic challenges of chance and necessity.
My talk was about situating the concepts of chance and necessity, in their broader sense, within the context of recent and ongoing discussions about the structure of evolutionary theory - from the Modern Synthesis of the 1940s to the newly proposed Extended Synthesis. I discussed the classic debate between Fisher and Wright, then moved to Gould's emphasis on contingency, at the same time that he was trying to establish paleontology on nomothetic grounds. I then used Cleland's distinction between prediction of future events and postdiction of past ones to mediate between the experimental and historical aspects of evolutionary biology. I concluded with an overview of the Extended Synthesis as outlined in an MIT Press volume that I recently co-edited with Gerd Muller.
David Haussler, The genome 10k project, what we might learn from sequencing 10,000 vertebrate genomes.
The cost of DNA sequencing is going down faster than the cost of microprocessor power. Hence the idea of starting a 10,000 - out of 60,000 known - vertebrate species genome project. Interested scientists and tissue samples sufficient for sequencing are already available for 16,000 species. The work is made difficult by the structural / architectural changes in the various genomes over time, which are superimposed on sequence-level changes. Still, one can follow both the birth of new genes, via duplication, and their death, via mutations causing a stop codon. The (rather naive?) long-term scenario is to map genomic changes to phenotypic ones, thereby mapping the evolution of vertebrate form at the genomic level. An interesting early result is that early in the phylogenetic history of vertebrate clades we observe an excess of regulatory innovation affecting transcription factors. This excess then tapers off, and regulatory elements become just as likely to mutate as other parts of the genome. On the other hand, changes in receptor binding sites become more important later in evolution, also eventually dropping off. Finally, more recent evolution is marked mostly by changes in intra-cellular signaling. So: early importance of developmental changes, an intermediate period targeting intercellular-level changes, and finally intra-cellular changes. (This was an interesting talk on its merits, though it is hard to see what it had to do directly with the theme of the conference. I suspect this will be true for several other talks over the next couple of days.)
Gill Bejerano, Change and constancy in the evolution of the human genome.
Consider the contrast between having 20,000 protein-coding genes vs. about 1,000,000 genomic switches controlling the expression of those genes. A large number of cis non-coding regions seem to have evolved under purifying selection. (Must admit that my eyes glaze over when slide after slide explains the various techniques used to gather the relevant molecular biology data...) (Still asleep: in the last two talks I have not heard the words "Monod," "chance," or "necessity" very much, if at all.)
Daniel Hartl, Chance favors the prepared genome, copy number variation and the origin of new genes.
Whole gene and partial duplications are frequent, though most of them are lost quickly. Chimeric combinations often lead to the evolution of new genes in Drosophila. The estimate is about 100 duplications per million years, 10% of which are chimeras. The two types of genes are then lost at the same rate. The rest of the talk focused on a couple of specific examples of the evolution of particular chimeric genes, one of which has been the locus of a recent - 15,000 years ago - selective sweep. The second example presented the case of a large number of structural events - deletions and insertions - which would maintain functionality only if they happened simultaneously. The way this happened was not by intelligent design ;-) but by way of resolving a stalled replication fork, which would have caused cell death at the moment of division. In other words, a number of molecular events that normally would be interpreted as having happened over a large number of generations likely occurred in a single molecular reshuffling inside an individual cell. (Talk about non-gradual evolution...)

Tuesday, October 19, 2010

CALL FOR ABSTRACTS WORKSHOP: Discovery in the social sciences: Towards an empirically informed philosophy of social science

University of Leuven, Belgium, March 22-23, 2011
Submission deadline for abstracts: December 31, 2010.
Notification of acceptance: January 15, 2011.
Keynote speakers
Alison Wylie (University of Washington)
Jack Vromen (Erasmus University Rotterdam)

Call for papers:
The aim of this workshop is to bring together scholars who are working in the philosophy of the social sciences, especially those interested in scientific practice. The theme is discovery in the social sciences.
We invite submissions of extended abstracts (about 1000 words), and we are especially eager to hear from young researchers, including graduate students, postdoctoral fellows, tenure-track professors and other recent PhDs, working in the philosophy of the social sciences or related fields. We are interested both in case studies that examine specific instances of discovery in the social sciences and in more theoretical or methodological papers that are informed by scientific practice. We take 'discovery' in a broad sense, meaning the discovery of empirical phenomena, theories and laws. 'Social sciences' refers to a broad range of disciplines, including (but not limited to) economics, anthropology, history, archaeology, psychology (including neuroscience), linguistics, and sociology.

Possible topics (not an exhaustive list) include:
- What is specific to discoveries in the social sciences?
- What is the epistemic role of artefacts in discovery, for example in neuroscientific research?
- Can we discern patterns in discovery in the social sciences?
- The discovery of laws in social sciences.
- Case-studies of discovery in specific social sciences.
- Creativity in social scientific practice.

Please send your abstract, preferably as pdf or rtf, to Helen De Cruz, using the following e-mail address @ (remove spaces) by December 31, 2010. Please also indicate your position (e.g., graduate student, postdoc, assistant professor, etc.).
Scientific committee: Helen De Cruz (University of Leuven), Eric Schliesser (Ghent University), Farah Focquaert (Ghent University), Raymond Corbey (University of Leiden and Tilburg University).
This workshop is supported by funding from the University of Leuven and Ghent University.

Saturday, October 16, 2010

Postdoc: Mellon Postdoctoral Fellowship at Wisconsin

The University of Wisconsin-Madison invites applications for Mellon Postdoctoral Fellowships in the humanities and the humanistic social sciences. The theme for 2011-13 is Life, broadly construed.

Details about the fellowship can be found at

The deadline for applications is November 15, 2010. Applications should be sent electronically to:
If you have questions, please contact Jessica Courtier, Mellon Postdoctoral Fellows Coordinator, at that email address or phone her at 608.516.8109.

Friday, October 15, 2010

Leiter concedes, redoes poll

Brian Leiter has graciously accepted the arguments that his original poll on the most significant philosopher of science was marred by oversight:

I am very pleased that Duhem, Michael Polanyi, Moritz Schlick, David Lewis, Frank Ramsey, and David Hull are now all included (but no Weber, Russell, or Weyl, alas!!!). I suspect only Lewis will make a big dent on the list, but I think it is important to avoid encouraging the already existing bias toward the recent past in such polls, which do help shape the discipline's self-perception.

Thursday, October 14, 2010

Most significant 20th century philosophers

Brian Leiter is running a poll of interest to readers of this blog:

He has acknowledged some significant oversights (Schlick and Hull). But as I point out here:
I think the situation is worse without Duhem, Russell, Weyl, and a few more controversial others (Husserl, Foucault, Zilsel, and Weber).
Chime in, and vote!

Sunday, October 10, 2010

Job: Tenure stream position in the HPS department at the University of Pittsburgh

POSITION: Tenure stream assistant professor in the Department of History and Philosophy of Science, pending budgetary approval.

Area of Specialization: History and philosophy of science and related areas that naturally complement departmental strengths. We have interest in strengthening areas of history and philosophy of neuroscience, physics, and general methodology.

Rank: Assistant professor

Responsibilities: Undergraduate and graduate teaching; regular departmental duties.

Applicants must submit the following materials, which will not be returned:

  • A curriculum vitae.
  • At least three confidential letters of reference.
  • Relevant academic transcripts.
  • Evidence of teaching ability.
  • Samples of recent writing.

The department regrets that it cannot solicit missing materials from applicants, or return any materials.

Please direct all inquiries and application materials regarding this position to:

The Appointment Committee
Department of History and Philosophy of Science
1017 Cathedral of Learning
University of Pittsburgh
Pittsburgh, PA 15260.

The University of Pittsburgh is an Affirmative Action, Equal Opportunity Employer. Women and members of minority groups underrepresented in academia are especially encouraged to apply.

Deadline for Applications: November 15, 2010

Please note that by accident this ad was not included in the October issue of Jobs for Philosophers.

Tuesday, October 5, 2010

The financial corruption of the economics profession

[Apologies for x-posting this from Apps, but the regulars here are familiar with my self-promotional activities!]
In general I argue that philosophers, and citizens more generally, ought to be more economically literate than they tend to be. In my view a lot of criticism of contemporary economics is based on a conflation of political rhetoric with the complex reality of economic research. (Such criticism also often conflates a number of different trends within economics.)

Nevertheless, there is a class of economists who have leveraged their economic expertise and become part of a revolving door between academia, industry, and government. (Often they also become apologists for the worst abuses of foreign dictatorships of Left and Right!) What is significant about the piece below is that it exposes the financial incentives that tempt economists. It may be well overdue that when economists publish journal articles and textbooks they reveal not just their research grants but also their consulting fees and sources. It would be strange if economists, of all people, were to think that (financial) incentives don't matter.

Thursday, September 30, 2010

Hacking and Franklin on the Functional Complexity of Evidence

After posting my paper here, I've happened in the last few days to come across two fabulous statements related to my position. Of course, just when you start to think you're doing something a little bit original, you come across all kinds of people saying basically the same thing.

Ian Hacking, on the first page of the monumental "Experimentation and Scientific Realism":
Experiments, the philosophers say, are of value only when they test theory. . . So we lack even a terminology to describe the many varied roles of experiment.  (Hacking 1982, p. 71)
And Allan Franklin, on the first page of his Selectivity and Discord:
Experiment plays many roles in science.  One of its important roles is to test theories and provide the basis for scientific knowledge.  It can also call for a new theory. . . Experiment can provide hints about the structure or mathematical form of a theory, and it can provide evidence for the existence of the entities involved in our theory. . . it may also have a life of its own, independent of theory: Scientists may investigate a phenomenon just because it looks interesting. Such experiments may provide evidence for future theories to explain. (Franklin 2002, p. 1)
It is a nice surprise to find myself in such good company.  The aim of my paper, of course, is to try to provide a coherent picture of and some terminology for the various roles of evidence.  One of the points that I make in the paper, which I'm not sure Hacking or Franklin would accept, is that there is a useful (functional) distinction to be drawn between observational and experimental evidence.  I suspect they might even say that I leave some roles out of my picture.

Tuesday, September 28, 2010

Milton Friedman and Richard Swinburne, coupled

Charles Manski, an economist at Northwestern associated with the prestigious NBER, has a working paper, POLICY ANALYSIS WITH INCREDIBLE CERTITUDE:
It explores an important topic, namely the tendency of policy sciences "to regularly express certitude about the consequences of alternative policy choices." In the paper Manski offers a typology of variants of this problem and offers an alternative. (I must thank one of my regular informants from within economics, Robert Goldfarb (who has done some lovely empirical work on how economists handle empirical data), for calling Manski to my attention!)
Now early in the paper Manski goes after Milton Friedman's famous (1953) methodology paper (known as F1953), couples him with the philosopher Richard Swinburne (well known in metaphysics and philosophy of religion), and criticizes both of them for their advocacy of the simplest hypothesis to the exclusion of others. (To the best of my knowledge Milton Friedman has never been compared to Richard Swinburne before.)

So far so good. Then Manski writes: "Does use of criteria such as “simplicity” to choose one hypothesis among those consistent with the data promote good policy making? This is the relevant question for policy analysis. To the best of my knowledge, thinking in philosophy has not addressed it."
Funny that. My recently published paper on the influence of Milton Friedman's methodology on the Chilean Chicago Boys explores precisely this issue:
Rarely have I had a better advocate for the relevance of my work! Is there other work on the relationship between simplicity and policy science?

Sunday, September 26, 2010

Varieties of Evidence Redux

About a year ago, I posted three blog posts here, arguing that scientific evidence serves a more complex and dynamic set of functions in scientific inquiry than simply supporting hypotheses. I've finally managed to work the idea out in a form that I'm satisfied with:

The Functional Complexity of Scientific Evidence (Draft)

I'm especially indebted to the commenters on this blog for the content of section 6, including Thomas Basbøll, Greg Frost-Arnold, Gabriele Contessa, and Eric Winsberg. (I hope I've given appropriate credit where credit is due there. I was a bit stymied about how exactly to refer to a conversation we had on the blog, and so made the acknowledgments there fairly general. Advice on that point is welcome.)

I hope I've managed to present it in a compelling way and answer the objections in a satisfactory way, even though I'm sure many traditionalists won't be convinced. The goal in this paper is to motivate the need for a more complex, functionalist, dynamic model of evidence in contrast with the oversimplification of the traditional model, to set out such a model in detail, to illustrate it with an example, and to reply to some basic objections. I've got a second paper in progress which applies the basic framework to a variety of problems of evidence, from theory-ladenness and the experimenter's regress to "evidence for use" and evidence-based public policy. My central claim there is that these apparently diverse problems all share a set of assumptions, and that the strongest way to solve them all is to adopt the dynamic evidential functionalism that I've laid out in this first paper.

One reason that I needed to whip this paper into shape is that I'm presenting on the topic of the sequel at the Pitt workshop on scientific experimentation.  Getting this in final form is part of finishing up that paper.  The working title there is "From the Experimenter’s Regress to Evidence-Based Policy: The Functional Complexity of Scientific Evidence."

If anyone gets a chance to look at the paper, I'd appreciate any comments, here or via email. 

Friday, September 24, 2010

1st Dutch-Flemish Graduate Conference on Philosophy of Science and/or Technology, Ghent 25-26 November

1st Dutch-Flemish Graduate Conference on Philosophy of Science and/or Technology
The NFWT organizes its first graduate conference for advanced master's students, PhD students, and recent PhDs working on the philosophy of science and/or technology. The goal of this conference is to help such researchers establish a research network, and try out papers in a cordial setting. All participants will be allotted ca. 30 minutes to present a paper, followed by 15 minutes of discussion.
There will be two keynote lectures on the topic of “levels of organization in the life sciences”, and contributions related to this topic are especially encouraged, without this being an exclusionary criterion.
Abstracts of maximum 500 words should be submitted no later than October 1, 2010, by email to: Notification of acceptance will be sent by October 10.
Dates: 25 and 26 November 2010
Venue: Het Pand, Ghent University, Ghent
Keynote speakers: Jon Williamson (Kent University) and Gertrudis Van de Vijver (Ghent University)
For more information on the NFWT (Dutch-Flemish Network for Philosophy of Science and Technology), see:

CFP: EPSA, Athens, Greece, 5-8 October 2011.

The Third Conference of the European Philosophy of Science Association (EPSA) will take place at the University of Athens, Greece, 5-8 October 2011. Contributed papers and proposals for symposia are invited by 28 February 2011.
For details of the call, please visit this website:

Monday, September 20, 2010

PhD Position (Ghent)

The Department of Philosophy and Moral Sciences at Ghent University has a vacancy for a PhD researcher in connection with the research professorship of Prof. Dr. Eric Schliesser. The area of interest is open, with a slight preference for candidates interested in philosophy and history of economics, history and philosophy of science, early modern philosophy (from Descartes to Kant), and metaphysics.
For more information:

Wednesday, September 15, 2010

CFP: NOVEL PREDICTIONS, February 25-26, 2011, Heinrich-Heine Universitaet Duesseldorf, Germany.

Organisers: Gerhard Schurz, Ludwig Fahrbach and Ioannis Votsis

Invited Speakers: Martin Carrier (Bielefeld), Deborah Mayo (Virginia Tech), Cornelis Menke (Bielefeld), Stathis Psillos (Athens), Roger White (MIT) and John Worrall (LSE).

The aim of the conference is to explore new and fruitful answers to three central questions: What are novel predictions? Ought novel predictions to have more epistemic weight than mere accommodations? Can novel predictions help us make headway in the scientific realism debate? We expect that the talks will cover one or more of the following related topics: simplicity, unification, curve-fitting, approximate truth, inference to the best explanation, the no-miracles argument and scientific theory change.

We invite abstracts of up to 500 words on any of the above or closely related topics. Please e-mail contributions to Ioannis Votsis ( ). Make sure to include your full name, institutional affiliation and e-mail address.

Submission Deadline: 15 OCTOBER 2010
Acceptance Notification: 15 NOVEMBER 2010

We hope to publish the proceedings of the conference in a reputable scientific journal. Upon completion of the conference, we will invite participants to submit written-up versions of their talks. Submitted papers will then be subjected to a peer-review process.

Speakers – Provisional Talk Titles:
Martin Carrier (Bielefeld) 'Prediction in Context: On the Comparative Epistemic Merit of Predictive Success'
Deborah Mayo (Virginia Tech) 'Some Surprising Facts About (the problem of) Surprising Facts'
Ludwig Fahrbach (Duesseldorf) 'Novel Predictions: In Search of the
Cornelis Menke (Bielefeld) 'On the Vagueness of "Novelty" and Chance as an Explanation of Predictive Success'
Stathis Psillos (Athens) 'Novelty-in-Use: On Perrin's Argument for
Gerhard Schurz (Duesseldorf) 'Theoretical Parameters and Use-Novelty Criterion of Confirmation'
Ioannis Votsis (Duesseldorf) 'Novel Predictions: The Few Miracles Argument for Scientific Realism'
Roger White (MIT) 'Testing'
John Worrall (LSE) 'Prediction and Accommodation: A Comparison of Rival

Attendance is open to all. If you plan to attend please contact Ioannis Votsis ( ).

CFP: THEORY-LADENNESS OF EXPERIENCE March 10-11 2011, Heinrich-Heine Universitaet Duesseldorf, Germany.

Organisers: Gerhard Schurz, Michela Tacca and Ioannis Votsis
Invited Speakers: William Brewer (Illinois, Urbana-Champaign), Allan Franklin (Colorado), Martin Kusch (Vienna), Athanassios Raftopoulos (Cyprus), Susanna Siegel (Harvard) and Markus Werning (Bochum).

The aim of the conference is to bring together philosophers, psychologists and cognitive scientists whose work contributes to our understanding of the scope and limits of theory-ladenness phenomena, where these are broadly construed to include the domains of perception, scientific evidence and language. We hope that the resulting synergy will help provide novel and fruitful answers to questions like the following: Is perception cognitively penetrable and, if so, how? Does the choice of scientific theory affect how we select, interpret and assess the evidential worth of data from experiments? Under what circumstances can we doubt the veridicality of scientific instruments? Can we draw a sharp distinction between terms that are theoretical and those that are observational? We thus expect that the talks will deal with one or more of the following topics: the modularity of mind, nonconceptual content, the epistemology of evidence and the semantics of observational terms.

We invite abstracts of up to 500 words on any of the above or closely related topics. Please e-mail contributions to Ioannis Votsis ( ). Make sure to include your full name, institutional affiliation and e-mail address.
Submission Deadline: 01 NOVEMBER 2010
Acceptance Notification: 01 DECEMBER 2010

We hope to publish the proceedings of the conference in a reputable scientific journal. Upon completion of the conference, we will invite participants to submit written-up versions of their talks. Submitted papers will then be subjected to a peer-review process.

Speakers – Provisional Talk Titles:
William Brewer (Illinois, Urbana-Champaign) 'Naturalized Approaches to Theory Ladenness: Evidence from Cognitive Psychology, History, and the Ecological Validity Argument'
Allan Franklin (Colorado) 'Theory Ladenness and the Epistemology of Experiment'
Martin Kusch (Vienna) 'Modules and Microscopes'
Athanassios Raftopoulos (Cyprus) 'Cognitive Impenetrability, Nonconceptual Content, and Theory-Ladenness'
Gerhard Schurz (Duesseldorf) 'Ostensive Learnability as Criterion for Theory-Neutral “Observation” Concepts'
Susanna Siegel (Harvard) 'Cognitive Penetrability and Perceptual Belief'
Michela Tacca (Duesseldorf) 'Cognitive Penetrability and the Content of Perception'
Ioannis Votsis (Duesseldorf) 'The Observation-Ladenness of Theory'
Markus Werning (Bochum) 'The Role of Action in Perception'

Attendance is open to all. If you plan to attend please contact Ioannis Votsis ( ).

The Limits of Science

Philosophy of science in the public domain:
I think Gottlieb is a bit unfair to the skeptics, but it is still pretty decent stuff.

Tuesday, September 14, 2010

Speculative vs experimental philosophy

There is a new Otago-based blog centered on a fun, timely, and interesting HPS project:
With the rise of experimental philosophy, renewed interest in earlier attempts at experimental philosophy is timely, and I wish the Otago group much luck!
One of the main conceits behind the Otago project is that the Empiricism-Rationalism distinction is a construct of Kantian philosophy and misdescribes early modern philosophy. This view is widespread among early modern scholars, although I wouldn't be surprised if a majority of practitioners still buy into some version of the distinction. The Otago group proposes another distinction, between speculative and experimental philosophers, and that framework drives the project. This has three virtues: 1. the distinction can be mapped onto debates within contemporary philosophy; 2. it does justice to much 17th century thought (it is an actor's category); 3. it gives the group coherence and economies of scale (to use grant-speak).
Now as the wording of my second virtue suggests, I have some qualms. It ignores at least one other group of philosophers, namely those who believed in (mathematical) theory-mediated measurement. I am thinking of Galileo, Huygens, and Newton, among the best known. They are not best described as experimental philosophers, although all were accomplished experimentalists (and Newton's Opticks is often assimilated to experimental traditions), but their work has a very different character from that of, say, Bacon or Boyle. (They are also not best described as speculative, because all three practiced self-restraint on published speculation.) Certainly after the Principia this approach created a standing challenge to all other forms of philosophizing. So the Otago framework will run into big trouble in the 18th century.
I have argued that a better contrast can be drawn between those who thought that inspecting ideas (whatever their source - so this includes rationalists and empiricists) was the way forward and those who advocated theory-mediated measurement. Moreover, it turns out that this distinction maps onto a related one, between system-building philosophers and the piecemeal approach, which I think better clarifies the predicaments of our philosophic times. But about these matters some other time.

Sunday, September 12, 2010


In re-reading Quentin Skinner's classic "Meaning and Understanding in the History of Ideas" I was struck that Skinner welds together a Wittgensteinian philosophy of language (and an Anscombean philosophy of action) with a Kuhnian philosophy of science (all acknowledged in the text). (Given the intellectual proximity of Kuhn and Cavell, something of this sort can also be found in Kuhn's writings.) The resulting therapeutic aims for the (contingency in, contingency out model supplied to the) historical sciences are only mildly to my liking, but about that some other time. Here my question is: does anybody know whether, in all the writings on Kuhn, anybody has targeted or clearly diagnosed Kuhn's appropriation of Wittgenstein, or the Wittgensteinian elements in Kuhn?

Thursday, September 2, 2010

Philosophy of statistical mechanics

David Albert has a (rather self-indulgent--yes, and that coming from me!) but usefully critical review of a collection of essays on the philosophy of statistical mechanics edited by Gerhard Ernst and Andreas Hutteman, which includes chapters by several contributors to this blog. The review can be found here:

Maybe it's time for a good discussion?

Tuesday, August 31, 2010

Second Young Researchers Days & Workshop on the Relations between Logic, Philosophy and History of Science

September 6-7, 2010, Palais des Académies, Rue Ducale / Hertogstraat 1, Brussels.
If you happen to be in the Low Countries next week, this should be fun:

Sunday, August 29, 2010

A mainstream economist admits the obvious [told you!]

It is rare to hear a prominent mainstream economist discuss so frankly (in public) the ways in which non-trivial value judgments enter into welfare economics and the public pronouncements of economists:

I probably shouldn't say, "I told you so," but...I told you so:
[The published version will be available soon:]

Moreover, elsewhere I tell the story how even at Chicago-Economics (where they were early and rather trenchant critics of the claims of value-neutrality of welfare economics), the new welfare economics was adopted:
For philosophers this paper may be entertaining (or a cautionary note), because I show how Kuhn's ideas were both anticipated and then aggressively promoted to create a mythic history (and thus stifle dissent) at 'Chicago'.

Friday, August 27, 2010

Duck and drake clusters

The following post about homeostatic property clusters (HPCs) is pretty long, so I've split it into several sections. Here's the very short version: Ereshefsky and Matthen argue that the HPC approach to natural kinds fetishizes similarity and is undone by polymorphism. I argue that it's not, and that the HPC approach is really about looking for causal structure.
[crossposted at Footnotes on Epicycles]

Thursday, August 26, 2010

SEP on Enlightenment

I love the Stanford Encyclopedia of Philosophy, which is why I try to read (well, scan) most of the new and updated entries. You can, too:
So, I really don't want to be known for kvetching about SEP (as I did recently:

But while I picked on the Copernicus article because of my own (no doubt rather eccentric) pet peeves, the entry on "Enlightenment" is based on claims that do not withstand scrutiny. It is also clearly informed by a self-serving German (if not outright Kantian, as understood by certain Rawlsians) historiography of the Enlightenment. (This dawned upon me when I read that "Only late in the development of the German Enlightenment, when the Enlightenment was near its end, does the movement become self-reflective." Such a bizarre claim is only possible because Rousseau, who famously challenged the value of Enlightenment, is treated as an entirely moral-political thinker; his three Discourses are not even mentioned in the bibliography!) [The secondary literature bibliography is rather limited.]

In what follows, I have tried to emphasize the HPS relevance of my concern. (This is not a reach because Newton plays a crucial role in the narrative:
So when William Bristow writes, "It belongs centrally to the agenda of Enlightenment philosophy... to provide a metaphysical framework within which to place and interpret this new knowledge" he imposes the Kantian conception onto the subject; for many Enlightenment thinkers natural philosophy makes metaphysics irrelevant.)

Here are two claims from the entry's very first paragraph that reveal some of the article's methodological and historical flaws:
I. "Enlightenment thought culminates historically in the political upheaval of the French Revolution." If we think in strict calendar periods, one might be inclined to agree. But a) now it looks like the French [why not American?] Revolution is a kind of teleological outcome of Enlightenment thought; this goes against the self-understanding of a lot of politically gradualist Enlightenment thinkers (especially in Scotland). And b) if the Enlightenment is a kind of regulative ideal (for future-oriented action), then the French Revolution may mark the real (as opposed to merely theoretical) possibility of Enlightenment, but by no means its completion. (Think of Lincoln at Gettysburg, who turned the US Constitution into an open-ended project.) This option is not irrelevant for those (i.e., many eighteenth-century historians) who wish to have a *science of history* that can shape the future. And c) why think that Enlightenment must culminate in political events rather than in a change of attitudes or knowledge?

II. "The dramatic success of the new science in explaining the natural world, in accounting for a wide variety of phenomena by appeal to a relatively small number of elegant mathematical formulae, promotes philosophy (in the broad sense of the time, which includes natural science) from a handmaiden of theology, constrained by its purposes and methods, to an independent force with the power and authority to challenge the old and construct the new, in the realms both of theory and practice, on the basis of its own principles."
Well, no. A lot of philosophy (including natural philosophy) remained in some respects a handmaiden of theology or natural theology. Newtonianism routinely got connected with theological (theo-cosmological) arguments. (It is as if Weber and Merton never wrote.) Many of the folks who are most eager to see philosophy end its handmaiden role (Spinoza, Hume, Diderot) are also most ambivalent about the course of mathematical natural philosophy. [Not to mention that there is now a very rich literature on Catholic Enlightenments.]
The whole article conflates secularization and the advancement of science (as well as the idea of progress).

I could go on and on, paragraph by paragraph (and maybe I will in future postings), but this is long enough for now.

Tuesday, August 24, 2010

cfp - Graduate Conference on Philosophy of Science and/or Technology GHENT

1st Dutch-Flemish Graduate Conference on Philosophy of Science and/or Technology
The NFWT organizes its first graduate conference for advanced master's students, PhD students, and recent PhDs working on philosophy of science and/or technology. The goal of this conference is to help young researchers establish a research network and try out papers in a cordial setting. All participants will be allotted ca. 30 minutes to present a paper, followed by 15 minutes of discussion.
There will be two keynote lectures on the topic of “levels of organization in the life sciences”, and contributions related to this topic are especially encouraged, without this being an exclusionary criterion.
Abstracts of maximum 500 words should be submitted no later than October 1, 2010, by email to: Notification of acceptance will be sent by October 10.
Dates: 25 and 26 November 2010
Venue: Het Pand, Ghent University, Ghent
Keynote speakers: Jon Williamson (University of Kent) and Gertrudis Van de Vijver (Ghent University)

For more information on the NFWT (Dutch-Flemish Network for Philosophy of Science and Technology), see:

Thursday, August 19, 2010

Copernicus at Stanford Encyclopedia of Philosophy

The entry on Copernicus has been updated at the Stanford Encyclopedia of Philosophy:

I want to offer one minor kvetch. The article claims: Copernicus "was responsible for the administration of various holdings, which involved heading the provisioning fund, adjudicating disputes, attending meetings, and keeping accounts and records. In response to the problem he found with the local currency, he drafted an essay on coinage (MW 176–215) in which he deplored the debasement of the currency and made recommendations for reform. His manuscripts were consulted by the leaders of both Prussia and Poland in their attempts to stabilize the currency."

This is all that's said about the matter! Now, this understates the significance of Copernicus on these matters. First, Copernicus articulated what is often known as Gresham's Law well before Gresham. (See Wikipedia here: More important, Copernicus articulated what is known as the quantity theory of money (often attributed to David Hume). Again, see Wikipedia:

The quantity theory is a major conceptual and 'scientific' achievement. It is a milestone in economic theorizing. Now, by failing to investigate this more fully, the entry at SEP perpetuates the blindness among philosophers to a) the shared history between philosophy and economics (and political economy) and b) their ongoing mutual development; it also c) makes Copernicus' interest in theorizing about currency (shared by Galileo, Newton, Locke, Berkeley, and Hume) seem largely insignificant.
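For readers unfamiliar with the quantity theory, here is a minimal sketch of its modern formulation, the so-called equation of exchange (this algebraic form postdates Copernicus, who stated the idea qualitatively: prices rise when money becomes abundant). The variable names and numbers below are purely illustrative:

```python
def price_level(money_supply, velocity, real_output):
    """Solve the equation of exchange M * V = P * Q for the price level P,
    where M is the money stock, V the velocity of circulation,
    and Q real output."""
    return money_supply * velocity / real_output

# Holding velocity and output fixed, doubling the money stock
# doubles the price level -- the core quantity-theory claim.
p1 = price_level(1000, 4, 500)  # P = 8.0
p2 = price_level(2000, 4, 500)  # P = 16.0
assert p2 == 2 * p1
```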

End of rant!


13-15 May 2011, University of Toronto. Presented by the Institute for the History and Philosophy of Science and Technology, University of Toronto and the Fishbein Center for the History of Science and Medicine, University of Chicago

The philosophy of science has an illustrious history of attraction and antipathy towards metaphysics. The latter was famously exemplified in the Logical Positivist contention that metaphysical questions are meaningless, but in the wake of the demise of Positivism, metaphysics has found its way back into the philosophy of science. Increasingly, questions about the nature of natural laws, kinds, dispositions, and so on have taken a metaphysical cast. The metaphysics of science commands significant attention in contemporary philosophy.
While many philosophers embrace the increased contact between metaphysics and the philosophy of science, others are wary. Should science (and its philosophical study) lead us into doing metaphysics? If so, which metaphysical issues are genuine and which are illusory, and how might we tell? Such questions dovetail with similar soul-searching in metaphysics proper (sometimes under the banner of "meta-metaphysics", sometimes simply as methodology).
This conference will examine ground-level debates about metaphysics within the philosophy of physics and the philosophy of biology, and broader methodological questions about the role of metaphysics in the philosophy of science. Participation is open and welcome from all parties to these questions: from those who hold that metaphysics must have a place within the philosophy of science, to those who hold it should not.

Craig Callender (University of California, San Diego)
Anjan Chakravartty (University of Toronto)
Katherine Hawley (University of St. Andrews)
Jenann Ismael (University of Arizona)
James Ladyman (University of Bristol)
Kyle Stanford (University of California, Irvine)
Michael Strevens (New York University)
Robert Wilson (University of Alberta)
C. Kenneth Waters (University of Minnesota)

Essays of 4,000-5,000 words (30 minutes allotted for presentations) concerning any aspect of metaphysics and the natural or social sciences will be accepted for review until January 10, 2011. Please include a short abstract (200 words or so) and a few keywords, prepare your essay for blind review (do not include your name or other identifying references in the document), and submit it in PDF format here:
Notification by early February 2011.

Chris Haufe (University of Chicago)
Matthew H. Slater (Bucknell University)
Zanja Yudell (California State University, Chico)
Please direct general conference inquiries to

Tuesday, August 17, 2010

Homage to Ian Mueller

I was in the Chicago philosophy graduate program during the 1990s. My
primary field of study was philosophy of physics, but I spent a good
third of my time on ancient Greek philosophy as well, most of it with
Ian. I adored Ian, both personally and professionally. I feel
privileged to have been his student, and even more to have known him
as a person. I find as I make my way through the world of academic
philosophy that by and large the people who know Ian---and when
someone in the field knows Ian, they invariably revere him---are those
people who themselves do the finest work.

Ian was a philosopher's philosopher---a true scholar and open-minded
thinker who never let his astonishing carefulness and thoroughness
degenerate into pedantry. He was the only person I know who could
make the commentaries and the apparatuses fun. (Indeed, this is the
thanks I gave him in the "Acknowledgments" section of my doctoral
dissertation, the second person I thanked there: "It is a pleasure to
acknowledge and thank the following people.... Ian Mueller---for
exemplifying the spirit of careful scholarship, and for making me
realize that sometimes (not often, but sometimes) studying the
secondary literature can be almost as rewarding as reading the
original text.")

This is one of my fondest memories of Ian. We were in the weekly
group he used to lead on Aristotle's *Metaphysics*, going through a
particularly difficult passage in Book Lambda, as always going through
the text line by line, word by word (while always keeping an eye
firmly fixed on the bigger picture). At one point, I recalled that
Ross, in the commentary to his edition of the Greek, had an
interesting take on a disputed reading, so I offered my recollected
gloss on it. Ian looked puzzled, and said surely that was not right,
that was not what Ross had said. I guess I was feeling cocky, because
normally I would have deferred to Ian's mastery of the apparatus, but
on that occasion I was sure I was right and said so. Like dueling
gunslingers, Ian and I simultaneously and gleefully (albeit, Ian in
his understated way) reached for our copies of Ross and scrambled to
beat each other to the relevant part of the commentary. At about the
same moment, again, we each declared ourselves to be right. And
looked at each other puzzled, because we could not both be right.
After a moment's confusion, we worked out that I had the second
edition of Ross and Ian had the first. I figured that was the end of
the matter, but Ian asked to see my copy. Lovingly he laid the two
editions side by side and perused them in turn for several moments,
working out the details and subtleties of Ross's apparent change of
heart, clearly trying to figure out not only the substance but the
reasons behind it. Finally, dreamily, he looked up, eyes on the
Platonic Heaven, and said softly, "God help me, I love this stuff."

I tried to tell Ian several times how much he meant to me, how much he
had contributed to my intellectual development---how much of my
teaching and research, even to this day, even on topics not related to
ancient philosophy, is still done with him consciously in my mind as a
paragon. He always brushed it aside with a shy modesty that was
humbling to me. I know full well that I am far from the only one of
Ian's ex-students to feel this way.

Monday, August 16, 2010


As Brian Leiter reported, The New York Times has recruited Timothy Williamson for its online blog, The Stone. In a recent entry (perhaps his first?) he writes about the role of the imagination in science:

The main point of the entry is revealed in its closing paragraph. It is to answer unnamed "Critics of contemporary philosophy" who "sometimes complain that in using thought experiments it loses touch with reality...Once imagining is recognized as a normal means of learning, contemporary philosophers’ use of such techniques can be seen as just extraordinarily systematic and persistent applications of our ordinary cognitive apparatus."

I offer four observations:
1. First, Williamson makes it easy on himself by simply asserting without evidence that contemporary philosophers’ use of imagination can be seen as just extraordinarily systematic and persistent applications of our ordinary cognitive apparatus. The blog clearly implies that if the imagination is good enough for science it is good enough for philosophy. But Williamson makes no effort to show that contemporary philosophers systematically constrain the use of the imagination in the manner that scientists (perhaps?) do. He just asserts philosophers' systematicity and persistence. (The piece ends a line later.) This is an argument from authority.

2. Nevertheless, my reason for blogging about this entry is not to continue harping on the tendency of leading analytic philosophers to claim the mantle of science when it suits them. Rather, it is to note the surprising (to me!) impact of recent (well, post-Kuhnian!) history and philosophy of science on Williamson's thought in at least two ways. First, Williamson takes the context of discovery very seriously. It is what grounds his appeal to the authority and use of the imagination. Second, he asserts that even in the context of justification the imagination plays a very important role, and this is a good thing.

3. So, perhaps philosophers of science can engage Williamson on these two previous points in constructive fashion? The recent methodological turn among many leading (and young) analytic metaphysicians should be an opportunity in this respect.

4. I end with a historical note. Williamson's position is a rediscovery of David Hume's and especially his friend Adam Smith's understanding of science. In Smith's "The History of Astronomy," the imagination plays a positive constructive and justificatory role in natural science and philosophy: "Philosophy, therefore, may be regarded as one of those arts which address themselves to the imagination." As Smith writes, "For, though it is the end of Philosophy, to allay that wonder, which either the unusual or seemingly disjointed appearances of nature excite, yet she never triumphs so much, as when, in order to connect together a few, in themselves, perhaps, inconsiderable objects, she has, if I may say so, created another constitution of things, more easily attended to, but more new, more contrary to common opinion and expectation, than any of those appearances themselves." (IV.33, 75)