Thursday, December 01, 2005

Biology and Computing

When I got involved in computational science many years ago, the power, or lack thereof, of computing was one of the most frustrating parts of doing research. Getting time on centralized supercomputing resources was never easy, and one had to make many approximations just to render problems computationally tractable. Things started changing in the late '90s, when Linux clusters became practical, robust, and cheap. Suddenly, large computational resources were available to almost anyone; without that shift, the whole genomics/bioinformatics boom would never have happened.

I think the field is ready for the next "big thing". Routine calculations on large systems can now be done on commodity machines, and with Microsoft jumping into the fray with its CCS beta, it is evident that HPC has firmly arrived. Specialized hardware (e.g., Blue Gene) is being used for complex calculations on living and non-living systems. That said, I think the industry is ready for either the next big leap in computing architectures, one that makes the complex calculations now requiring a Blue Gene more accessible, or a change in usage paradigm that allows improved, easier access to these specialized resources. On-demand computing for scientific and technical applications is a hitherto untapped model that could, in theory, bring high-performance computing to a much larger set of users. As we seek to unlock the mysteries of cellular machinery, regulatory networks, nanotechnology, and many other fascinating fields, a radical change in how computing is accessed, or in the hardware we use, is needed, and hopefully not too far away.

