Von Neumann, who could have aspired to such a role, admitted to me some thirty-five years ago that he knew less than a third of the corpus of mathematics. At his suggestion I once concocted for him a doctoral-style examination in various fields, trying to select questions which he would not be able to answer. I did find some, one each in differential geometry, in number theory, and in algebra, which he could not answer satisfactorily. (This, by the way, may also tend to show that doctoral exams have little permanent meaning.)
As for myself, I cannot claim that I know much of the technical material of mathematics. What I may have is a feeling for the gist, or maybe only the gist of the gist, in a number of its fields. It is possible to have this knack for guessing or feeling what is likely to be new or already known, or else not known, in some branch of mathematics where one does not know the details. I think I have this ability to a degree and can often tell whether a theorem is known, i.e. already proved, or is a new conjecture. This is a sort of feeling that comes from the way the quantifiers are arranged, from the tone or the music of the statement, so to speak.
Speaking of this analogy: I can remember tunes and am able to whistle various melodies rather correctly. But when I try to invent or compose some new "catchy" tune, I find, rather impotently, that what I do is a trivial combination of what I have heard. This is in complete contrast to mathematics, where I believe that with a mere "touch" I can always propose something new.
Collaboration in mathematics is a very interesting and new phenomenon which has developed during the last several decades.
It is natural in experimental physics that investigators work together on the different phases of instrumentation. By now every experiment is really a class of technical projects, especially on the great machines which require hundreds of engineers and specialists for their construction and operation. In theoretical physics this is perhaps not as evident, but it exists, and strangely enough it exists in mathematics also. We have seen that the creative effort in mathematics requires intense concentration and constant thinking in depth for hours on end, and that when two individuals collaborate this effort is often shared by their just looking at each other and occasionally making a few remarks. It is now definitely the case that even on the most abstruse mathematical questions two or more persons work together in trying to find a proof. Many papers now have two, sometimes three or more, authors. The exchange of conjectures and the suggestion of tentative approaches help to build up partial results along the way. It is easier to talk than to write down every thought. There is an analogy here to analyzing a game of chess.
It may be that in the future large groups of mathematicians working together will produce important, beautiful, and simple results. Some have already been produced this way in recent years. For example, the solution of one of Hilbert's problems, the tenth, about the existence of algorithms to solve Diophantine equations, was really obtained (not in parallel, to be sure, but in sequence) by several scientists in this country, and at the end by a young Russian, Yuri Matiasevic, who took the last step. Several mathematicians working independently in the United States and in Poland, but aware of each other's results, solved an old problem of Banach's about the homeomorphism of his spaces. They were able to climb on each other's shoulders, so to speak.
It was after the publicity surrounding the construction of the atomic bomb in Los Alamos that the expression "critical mass" became current as a metaphoric description of the required minimal size of a group of scientists working together in order to obtain successful results. If large enough, the group produces results explosively. When the critical mass is reached, due to mutual stimulation the multiplication of results, like that of neutrons, becomes exponentially larger and more rapid. Before such a mass is attained, progress is gradual, slow and linear.
Other changes in the working habits of scientists have come more slowly. The mode of life in the ivory tower world of science now includes more scientific meetings and more involvement in governmental work.
A simple but important thing like letter-writing has also undergone a noticeable change. It used to be an art, not only in the world of literature. Mathematicians were voluminous letter writers. They wrote in longhand and communicated at length intimate and personal details as well as mathematical thoughts. Today the availability of secretarial help renders such personal exchanges more awkward, and as it is difficult to dictate technical material, scientists in general and mathematicians in particular exchange fewer letters. In my file of letters from all the scientists I have known, a collection extending over more than forty years, one can see the gradual, and after the war accelerated, change from long, personal, handwritten letters to more official, dry, typewritten notes. In my correspondence of recent years, only two persons have continued to write in longhand: George Gamow and Paul Erdös.
Chen Ning Yang, the Nobel prize physicist, tells a story which illustrates an aspect of the intellectual relation between mathematicians and physicists at present:
One evening a group of men came to a town. They needed to have their laundry done so they walked around the city streets trying to find a laundry. They found a place with the sign in the window, "Laundry Taken in Here." One of them asked: "May we leave our laundry with you?" The proprietor said: "No. We don't do laundry here." "How come?" the visitor asked. "There is such a sign in your window." "Here we make signs," was the reply. This is somewhat the case with mathematicians. They are the makers of signs which they hope will fit all contingencies. Yet physicists have created a lot of mathematics.
In some of the more concrete parts of mathematics, probability theory for example, physicists like Einstein and Smoluchowski opened certain new areas even before the mathematicians did. The ideas of information theory, of the entropy of information and its role in a general continuum, originated with physicists like Leo Szilard and with an engineer, Claude Shannon, and not with "pure" mathematicians, who could and ought to have developed them long before. Entropy, a property of a distribution, was a notion originating in thermodynamics and was applied to physical objects. But Szilard (in very general terms) and Shannon defined this notion for general mathematical systems. True, Norbert Wiener had some part in its origin, and wonderful mathematicians like Andrei Kolmogoroff later developed, generalized, and applied it to purely mathematical problems.
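As an illustrative aside, the notion Shannon made precise for an arbitrary discrete probability distribution $p_1, \ldots, p_n$ is given by the now-standard formula

$$ H(p_1, \ldots, p_n) \;=\; -\sum_{i=1}^{n} p_i \log_2 p_i , $$

which is largest for the uniform distribution and vanishes when one outcome is certain, so that it measures the "spread" of a distribution independently of any physical interpretation.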
In the past some mathematicians, Poincaré for example, knew a lot of physics. Hilbert did not seem to have too much true physical instinct, but he wrote very important papers about the techniques and the logic of physics. Von Neumann knew a good deal of physics too, but I would say that he did not have the physicist's natural feeling for and recourse to experiment. He was interested in the foundations of quantum mechanics as long as they could be mathematized. The axiomatic approach to physical theories is to physics what grammar is to literature. Such mathematical clarity need not be conceptually crucial for physics.
On the other hand, much of the apparatus of theoretical physics and occasionally some precursor ideas came from pure mathematics. The general non-Euclidean geometries prophetically envisaged by Riemann as having future importance for physics came before general relativity, and the definition and study of operators in Hilbert space came before quantum mechanics. The word spectrum, for example, was used by mathematicians long before anybody would have dreamed of using the spectral representation of Hilbert space operators to explain the actual spectrum of light emitted by atoms.