Computer Science; Information Systems
Metacomputing is the use of computing to study complex problems and design solutions to them. These problems can range from how best to design large-scale computer networking systems to how to determine the most efficient method for performing mathematical operations involving very large numbers. In essence, metacomputing is computing about computing. Metacomputing makes it possible to perform operations that individual computers, and even some supercomputers, could not handle alone.
The field of metacomputing arose during the 1980s. Researchers began to realize that the rapid growth in networked computer systems would soon make it difficult to take advantage of all interconnected computing resources. This could lead to wasted resources unless an additional layer of computing power were developed. This layer would not work on a computational problem itself; instead, it would determine the most efficient method of addressing the problem. In other words, researchers saw the potential for using computers in a manner so complicated that only a computer could manage it. This new metacomputing layer would rest atop the middle computing layer that works on research tasks. It would ensure that the middle layer makes the best use of its resources and that it approaches calculations as efficiently as possible.
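The idea of a layer that decides how to compute, rather than computing the answer itself, can be sketched in a few lines. In this hypothetical example (the strategy names and the `meta_dispatch` helper are illustrative, not part of any real system), the meta layer times each candidate method on a small sample of the workload and then dispatches the full job to whichever finished fastest:

```python
import time

def sum_of_squares_loop(numbers):
    """One candidate strategy: an explicit loop."""
    total = 0
    for n in numbers:
        total += n * n
    return total

def sum_of_squares_builtin(numbers):
    """A second candidate strategy: the built-in sum()."""
    return sum(n * n for n in numbers)

def meta_dispatch(strategies, workload, sample_size=1000):
    """Time each candidate on a small sample of the workload, then
    run the full workload with the fastest one. This is the 'meta'
    step: computing about how to compute."""
    sample = workload[:sample_size]
    timings = {}
    for name, fn in strategies.items():
        start = time.perf_counter()
        fn(sample)
        timings[name] = time.perf_counter() - start
    best = min(timings, key=timings.get)
    return best, strategies[best](workload)

strategies = {"loop": sum_of_squares_loop, "builtin": sum_of_squares_builtin}
chosen, result = meta_dispatch(strategies, list(range(10_000)))
```

Either strategy produces the same answer; the meta layer's only job is choosing between them efficiently, which is exactly the role the metacomputing layer plays over the middle computing layer.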
One reason an application may need a metacomputing layer is the presence of complexities. Complexities are elements of a computational problem that make it more difficult to solve. Domain-dependent complexities arise due to the context of the computation. For example, when calculating the force and direction necessary for an arrow to strike its target, the effects of wind speed and direction would be a domain-dependent complexity. Meta-complexities are those that arise due to the nature of the computing problem rather than its context. An example of a meta-complexity is a function that has more than one possible solution.
Metacomputing is frequently used to solve complex calculations by networking many different computers. The networked computers can combine their resources so that each one works on part of the problem. In this way, they become a virtual supercomputer with greater capabilities than any individual machine. One successful example of this is a project carried out by biochemists studying the way proteins fold and attach to one another. This subject is usually studied using computer programs that model the proteins’ behavior. However, these programs consume a lot of time and computing power. Metacomputing allowed the scientists to create a game, playable by users all over the world, that generates data about protein folding as it is played. Users try to fit shapes together in different ways, contributing their time and computing power to the project in a fun and easy way.
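The divide-and-combine pattern behind such a virtual supercomputer can be illustrated with a small sketch. Here the "workers" are simulated with threads on a single machine (a real metacomputing system would farm the chunks out to networked computers), and the sample problem, counting primes below a limit, along with the helper names, is invented for illustration:

```python
from concurrent.futures import ThreadPoolExecutor

def count_primes(lo, hi):
    """One worker's share: count primes in [lo, hi) by trial division."""
    count = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

def distributed_count(limit, workers=4):
    """Split [0, limit) into chunks, hand each chunk to a worker,
    and combine the partial results into one answer."""
    step = limit // workers
    bounds = [(i * step, limit if i == workers - 1 else (i + 1) * step)
              for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = pool.map(lambda b: count_primes(*b), bounds)
    return sum(partials)
```

No worker ever sees the whole problem; each computes an independent piece, and the combining step is trivial. That independence is what lets projects like the protein-folding game scale across thousands of volunteer machines.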
Another trend with great potential for metacomputing is ubiquitous computing, meaning computing that is everywhere and in everything. As more and more mundane devices are equipped with Internet-connected microprocessors, from coffee makers to cars to clothing, there is the potential to harness this computing power and data. For example, a person might use multiple devices to monitor different aspects of their health, such as activity level, water intake, and calories burned. Metacomputing could correlate all of this independently collected information and analyze it. This data could then be used to diagnose potential diseases, recommend lifestyle changes, and so forth.
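As a rough illustration of correlating independently collected information, the sketch below (the metric names and the `correlate_daily` helper are hypothetical) merges per-device data streams by day, so that each day's record combines every metric reported for it:

```python
def correlate_daily(streams):
    """streams: {metric_name: {day: value}} gathered by separate
    devices. Returns one combined record per day containing every
    metric that reported on that day."""
    days = set()
    for readings in streams.values():
        days |= readings.keys()
    return {day: {m: r[day] for m, r in streams.items() if day in r}
            for day in sorted(days)}

# Two hypothetical devices reporting independently
streams = {
    "steps": {"Mon": 8000, "Tue": 9500},
    "water_ml": {"Mon": 1500},
}
daily = correlate_daily(streams)
```

The interesting work in a real system would happen after this step, in the analysis of the combined records; the sketch only shows the correlation itself.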
One form of ubiquitous metacomputing that already exists is the way that various smartphone applications use location data to describe and predict traffic patterns. One person's data cannot reveal much about traffic. However, when many people's location, speed, and direction are reported simultaneously, the data can be used to predict how long one person's commute will be on a given morning.
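The aggregation step can be sketched as follows; the segment names, speeds, and helper functions are invented for illustration. Many individual (segment, speed) reports are averaged per road segment, and a commute estimate is then the sum of segment length divided by average speed along one route:

```python
from collections import defaultdict

def average_speeds(reports):
    """reports: (road_segment, speed_kmh) pairs from many phones.
    Returns the mean reported speed per segment."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for segment, speed in reports:
        sums[segment] += speed
        counts[segment] += 1
    return {s: sums[s] / counts[s] for s in sums}

def estimate_commute(route, segment_lengths_km, reports):
    """Estimate travel time in minutes along a route by summing
    length / average speed over its segments."""
    speeds = average_speeds(reports)
    hours = sum(segment_lengths_km[s] / speeds[s] for s in route)
    return hours * 60

# Four phones report on two segments of a hypothetical route
reports = [("A", 60), ("A", 40), ("B", 20), ("B", 30)]
minutes = estimate_commute(["A", "B"], {"A": 5, "B": 5}, reports)
```

A single report reveals little, as the passage notes; the estimate only becomes meaningful once many simultaneous reports are averaged per segment.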
Metacomputing is often used when there is a need for a computer system that can “learn.” A system that can learn is one that can analyze its own performance to make adjustments to its processes and even its architecture. The operation of such systems bears a strong resemblance to that of the human brain. In some cases they are intentionally designed to imitate the way the brain approaches problems and learns from its past performance. Metacomputing, in this sense, is not too dissimilar from metacognition.
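A toy version of such self-adjustment shows the pattern. The sketch below uses a least-mean-squares update, chosen purely for illustration: after each prediction the system measures its own error and nudges its internal weight accordingly, so its performance improves with experience:

```python
def self_tuning_predictor(samples, rate=0.1):
    """A minimal 'learning' loop. The system predicts, inspects its
    own error, and adjusts its internal weight (a least-mean-squares
    update), so later predictions improve."""
    weight = 0.0
    errors = []
    for x, target in samples:
        prediction = weight * x
        error = target - prediction
        errors.append(abs(error))
        weight += rate * error * x  # adjust based on own performance
    return weight, errors

# Learn the rule y = 2x from repeated examples
samples = [(x, 2 * x) for x in [1, 2, 1, 3, 2, 1, 2, 3] * 5]
weight, errors = self_tuning_predictor(samples)
```

The early predictions are poor and the later ones nearly exact: the system has changed its own process in response to its own measured performance, which is the sense of “learning” the passage describes.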
Metacomputing sometimes conjures up fears of the dangers posed by artificial intelligence in science fiction. In reality, metacomputing is just another category of computer problem to be solved, not the beginning of world domination by machines. Humans can conceive of computer problems so complex that it is nearly impossible to solve them without the aid of another computer. Metacomputing is simply the solution to this dilemma.
—Scott Zimmer, JD
Loo, Alfred Waising, ed. Distributed Computing Innovations for Business, Engineering, and Science. Hershey: Information Science Reference, 2013. Print.
Mallick, Pradeep Kumar, ed. Research Advances in the Integration of Big Data and Smart Computing. Hershey: Information Science Reference, 2016. Print.
Mason, Paul. Understanding Computer Search and Research. Chicago: Heinemann, 2015. Print.
Nayeem, Sk. Md. Abu, Jyotirmoy Mukhopadhyay, and S. B. Rao, eds. Mathematics and Computing: Current Research and Developments. New Delhi: Narosa, 2013. Print.
Segall, Richard S., Jeffrey S. Cook, and Qingyu Zhang, eds. Research and Applications in Global Supercomputing. Hershey: Information Science Reference, 2015. Print.
Tripathy, B. K., and D. P. Acharjya, eds. Global Trends in Intelligent Computing Research and Development. Hershey: Information Science Reference, 2014. Print.