Wednesday, June 15, 2011

Is Software Improving Exponentially?

In a discussion on the AGI email discussion list recently, some folks were arguing that Moore's Law and associated exponential accelerations may be of limited value in pushing the world toward Singularity, because software is not advancing exponentially.

For instance, Matt Mahoney pointed out "the roughly linear rate of progress in data compression as measured over the last 14 years on the Calgary corpus."

Ray Kurzweil's qualitative argument in favor of the dramatic acceleration of software progress in recent decades is given in slides 104-111 of his presentation here.

I think software progress is harder to quantify than hardware progress, thus less often pointed to in arguments regarding technology acceleration.

However, qualitatively, there seems little doubt that the software tools available to the programmer have been improving damn dramatically....

Sheesh, compare game programming as I did it on the Atari 400 or Commodore 64 back in the 80s ... versus how it's done now, with so many amazing rendering libraries, 3D modeling engines, etc. etc. With the same amount of effort, today one can make incredibly more complex and advanced games.

Back then we had to code our own algorithms and data structures; now we have libraries like the STL, so even novice programmers can use advanced structures and algorithms without understanding their internals.
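The STL is the canonical example, but the same point holds for any modern standard library. A minimal sketch in Python (using the standard heapq module, chosen here purely as an illustration): a novice gets a production-quality priority queue and a highly tuned sort without knowing how either works inside.

```python
import heapq

# heapq gives you a binary-heap priority queue; the user never needs to
# know the sift-up/sift-down machinery that maintains the heap invariant.
tasks = [(3, "render"), (1, "input"), (2, "physics")]
heapq.heapify(tasks)
first = heapq.heappop(tasks)  # the lowest priority number comes out first

# sorted() is Timsort, an adaptive merge/insertion hybrid -- but the
# caller only needs to know that it is fast and stable.
scores = sorted([5, 1, 4, 2, 3])
```

In the 80s, each of these would have meant hand-coding (and debugging) the underlying algorithm yourself.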

In general, the capability of programmers without deep technical knowledge or ability to create useful working code has increased *incredibly* in the last couple decades…. Programming used to be only for really hard-core math and science geeks, now it's a practical career possibility for a fairly large percentage of the population.

When I started using Haskell in the mid-90s it was a fun, wonderfully elegant toy language but not practical for real projects. Now its clever handling of concurrency makes it viable for large-scale projects... and I'm hoping in the next couple years it will become possible to use Haskell within OpenCog (Joel Pitt just made the modifications needed to enable OpenCog AI processes to be coded in Python as well as the usual C++).

I could go on a long time with similar examples, but the point should be clear. Software tools have improved dramatically in functionality and usability. The difficulty of quantifying this progress in a clean way doesn't mean it isn't there...

Another relevant point is that, due to the particular nature of software development, software productivity generally decreases for large teams. (This is why I wouldn't want an AGI team with more than, say, 20 people on it. 10-15 may be the optimal size for the core team of an AGI software project, with additional people for things like robotics hardware, simulation world engineering, software testing, etc.) However, the size of projects achievable by small teams has dramatically increased over time, due to the availability of powerful software libraries.

Thus, in the case of software (as in so many other cases), the gradual improvement of technology has led to qualitative increases in what is pragmatically possible (i.e. what is achievable via small teams), not just quantitative betterment of software that previously existed.

It's true that word processors and spreadsheets have not advanced exponentially (at least not with any dramatically interesting exponent), just as forks and chairs and automobiles have not. However, other varieties of software clearly have done so, for instance video gaming and scientific computation.

Regarding the latter two domains, just look at what one can do with Nvidia GPU hardware on a laptop now, compared to what was possible for similar cost just a decade ago! Right now, my colleague Michel Drenthe in Xiamen is doing CUDA-based vision processing on the Nvidia GPU in his laptop, using Itamar Arel's DeSTIN algorithm, with the goal of providing OpenCog with intelligent visual perception. This is directly relevant to AGI, and it's leveraging recent hardware advances coupled with recent software advances (CUDA and its nice libraries, which make SIMD parallel scientific computing reasonably tractable, within the grasp of a smart undergrad like Michel doing a six-month internship). Coupled acceleration in hardware and software for parallel scientific computing is moving along, and this is quite relevant to AGI, whereas the relative stagnation in word processors and forks really doesn't matter.

Let us not forget that the exponential acceleration of various quantitative metrics (like Moore's Law) is not really the key point regarding Singularity, it's just an indicator of the underlying progress that is the key point.... While it's nice that progress in some areas is cleanly quantifiable, that doesn't necessarily mean these are the most important areas....

To really understand progress toward Singularity, one has to look at the specific technologies that most likely need to improve a lot to enable the Singularity. Word processing, no. Text compression, not really. Video games, no. Scientific computing, yes. Efficient, easily usable libraries containing complex algorithms and data structures, yes. Scalable functional programming, maybe. It seems to me that by and large the aspects of software whose accelerating progress would be really, really helpful to achieving AGI are in fact accelerating dramatically.

In fact, I believe we could have a Singularity with no further hardware improvements, just via software improvements. This might dramatically increase the financial cost of the first AGIs, by making them require huge server farms ... which would affect the route to and the nature of the Singularity, but not prevent it.


Bob Mottram said...

Well as a practitioner my view on this is rather different.

There hasn't been much progress in software tools in the last couple of decades, although there are more libraries and APIs available. The business of software creation is still very much stuck in a text-editing mode of operation, and visual IDEs don't seem to have altered all that much in the last 15 years. The fact that it was possible to be a commercial programmer using Emacs in 1991 and also in 2011 says something about how little things have changed.

This doesn't mean that there hasn't been progress in the amount and quality of software produced, and it's certainly true that more libraries are available. The greater availability of free software also increases the rate at which larger software systems can be assembled, as does the availability of web search engines to look up programming information - replacing large expensive reference books.

Bob Mottram said...

Also, whether software is improving depends upon what you mean by improving. Software typically performs some utilitarian or entertainment function and as such can't be easily or cleanly separated from the cultural context within which this activity takes place. Due to changes in culture and technological infrastructure a particular computation which was performed in 1991 may no longer make any sense in the world of 2011.

Is software better adapted to its cultural context than it was a couple of decades ago? Yes I think it is now much more deeply embedded into and synchronised with the overall culture.

That software can continue to embed itself into the culture at some "exponential rate" - whatever that may mean - seems unlikely, and it may be that a significant fraction of the informational part of the industrial revolution is now behind us.

abramdemski said...

I think the data I was referring to in my argument on the AGI list was something like taking constraint satisfaction algorithms (or logical satisfiability algorithms) from the 90s vs. today, run on the same computing hardware. The amount of improvement is significant. Obviously two data points don't say much about exponential growth vs. other options, but at least it's a clear case of large improvements on the software side.

Kaj Sotala (Xuenay) said...

Ben, I always thought that Brooks' "No Silver Bullet" made a rather strong case against the possibility of exponential software improvement. What's your take on that?

Ben Goertzel said...

Kaj, about Brooks' argument, I agree that nowadays a lot more of a software team's time is spent dealing with the essential complexity of the tasks they're dealing with. However, due to modularity in modern software design, the achievements of prior software teams may be encapsulated in objects or other library entities, for future software teams to use. So, each generation of software teams is exploring the space of programs that can be built by a small team of humans utilizing all the library entities created by the previous software teams. Thus exponential growth may be possible in spite of the limitation of what can be done by a single small group of developers. This is not just an abstract point, it's what I think we actually see happening in areas like gaming and scientific computing where there is rapid progress.

Tim Tyler said...

The section of Brooks' "No Silver Bullet" article which relates to artificial intelligence seems pretty sucky to me. It does not make a very convincing case. To make a case for this you have to argue that developing intelligent machines increases in difficulty in a way that exceeds or keeps pace with the improvements in development speed they seem likely to bring. Brooks doesn't even attempt to make that case.

Anonymous said...

If intuition were to be defined as arriving at a logical conclusion without going through a logical process (emotional guidance system), allowing for real time interaction with the contextual environment, then allowing an abstraction at a higher level, lets call it philosophy, the PH of the D... is nothing more than another tier in the semantic domain, perhaps one level higher than gestalt... QSO

Stephen Reed said...

I believe that software is improving exponentially. A developer/designer can do more each year for the same amount of typing.

Furthermore, my own experiments with speech recognition & dialog suggest that programmers will be able to interact with their development environments with voice input.

I work primarily as a sole developer and solving programming problems has become much easier in recent years with the knowledge repositories provided by Internet sites such as Stack Overflow.
