Opening paragraphs from the Wikipedia article on the topic:
"In futures studies, a technological singularity (often the Singularity) is a predicted future event believed to precede immense technological progress in an unprecedentedly brief time. Futurists give varying predictions as to the extent of this progress, the speed at which it occurs, and the exact cause and nature of the event itself.
One school of thought centers around the writings of Vernor Vinge, in which he examines what I. J. Good (1965) described earlier as an "intelligence explosion." Good predicts that if artificial intelligence reaches equivalence to human intelligence, it will soon become capable of augmenting its own intelligence with increasing effectiveness, far surpassing human intellect. In the 1980s, Vernor Vinge dubbed this event "the Singularity" and popularized the idea with lectures, essays, and science fiction. Vinge argues the Singularity will occur following creation of strong AI or sufficiently advanced intelligence amplification technologies such as brain-computer interfaces.
Another school, promoted heavily by Ray Kurzweil, claims that technological progress follows a pattern of exponential (or super-exponential) growth, suggesting rapid technological change in the 21st century. Kurzweil considers the advent of superhuman intelligence to be part of an overall exponential trend in human technological development seen originally in Moore's Law and extrapolated into a general trend in Kurzweil's own Law of Accelerating Returns. Unlike a hyperbolic function, Kurzweil's predicted exponential model never experiences a true mathematical singularity.
While some regard the Singularity as a positive event and work to hasten its arrival, others view it as dangerous, undesirable, or unlikely to occur. The most practical means for initiating the Singularity are debated, as is how (or whether) it can be influenced or avoided if dangerous." (http://en.wikipedia.org/wiki/Technological_singularity)
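The article's distinction between exponential and hyperbolic growth is worth making concrete: an exponential curve like e^t stays finite at every finite time, whereas a hyperbolic curve like 1/(t0 − t) actually diverges to infinity at the finite time t0, a true mathematical singularity. A minimal sketch (the rate and singularity-time parameters below are illustrative, not taken from Kurzweil):

```python
import math

def exponential(t, rate=1.0):
    """Exponential growth e^(rate*t): large but finite at every finite t."""
    return math.exp(rate * t)

def hyperbolic(t, t_singular=1.0):
    """Hyperbolic growth 1/(t_singular - t): diverges as t -> t_singular."""
    return 1.0 / (t_singular - t)

# As t approaches the hyperbolic singularity at t=1, the two curves
# separate dramatically: the exponential barely moves while the
# hyperbolic value blows up without bound.
for t in [0.0, 0.5, 0.9, 0.99, 0.999]:
    print(f"t={t:5}: exp={exponential(t):8.3f}  hyper={hyperbolic(t):10.1f}")
```

This is the mathematical sense in which Kurzweil's exponential model, despite the name "Singularity," never reaches an actual singular point.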