The 21 Laws of Computing
The following are laws governing computing, as formulated by computer scientists and other thinkers over time.
Amdahl’s Law:
The speed-up achievable on a parallel computer can be significantly limited by the existence of a small fraction of inherently sequential code which cannot be parallelized. (Gene Amdahl)
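In formula form, if a fraction p of a program can be parallelized across n processors, the overall speed-up is 1 / ((1 − p) + p/n), so the serial remainder (1 − p) caps the gain no matter how large n grows. A minimal Python sketch of this (illustrative only; the function name is ours, not part of the law):

```python
def amdahl_speedup(p, n):
    """Speed-up on n processors when fraction p of the work is parallelizable."""
    return 1.0 / ((1.0 - p) + p / n)

# Even with 95% parallel code, 1024 processors give less than a 20x speed-up:
for n in (2, 16, 1024):
    print(n, round(amdahl_speedup(0.95, n), 1))  # -> 1.9, 9.1, 19.6
```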
Cope’s Law:
There is a general tendency towards size increase in evolution. (Edward Drinker Cope)
Ellison’s Law:
The user base for strong cryptography declines by half with every additional keystroke or mouseclick required to make it work. (Carl Ellison)
Brooks’s Law:
Adding manpower to a late software project makes it later. (Frederick Brooks Jr.)
Isaac Asimov’s Three Laws of Robotics:
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey orders given to it by a human being, except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Isaac Asimov's Zeroth Law of Robotics:
A robot may not injure humanity, or through inaction, allow humanity to come to harm.
Tesler’s Theorem:
Artificial intelligence is whatever hasn’t been done yet. (Larry Tesler)
Wirth’s Law:
Software gets slower faster than hardware gets faster. (Niklaus Wirth)
Weinberg’s Law:
If builders built buildings the way programmers wrote programs, then the first woodpecker that came along would destroy civilization. (Gerald M. Weinberg)
Church–Turing Thesis:
Every function which would naturally be regarded as computable can be computed by the universal Turing machine.
Conway’s Law:
Organizations produce designs that copy their own communication structures; if you have four groups working on a compiler, you’ll get a 4-pass compiler. (Melvin Conway)
Augustine’s second law of socio-science:
For every scientific (or engineering) action, there is an equal and opposite social reaction. (Norman Augustine)
Dilbert Principle:
The most ineffective workers are systematically moved to the place where they can do the least damage: management. (Scott Adams)
Benford’s Law of Controversy:
Passion is inversely proportional to the amount of real information available. (Gregory Benford)
Clarke’s First Law:
When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong. (Arthur C. Clarke)
Clarke’s Second Law:
The only way of discovering the limits of the possible is to venture a little way past them into the impossible. (Arthur C. Clarke)
Clarke’s Third Law:
Any sufficiently advanced technology is indistinguishable from magic. (Arthur C. Clarke)
Tesler’s Law of Conservation of Complexity:
You cannot reduce the complexity of a given task beyond a certain point. Once you’ve reached that point, you can only shift the burden around. (Larry Tesler)
Deutsch’s Seven Fallacies of Distributed Computing:
Reliable delivery; Zero latency; Infinite bandwidth; Secure transmissions; Stable topology; Single administrator; Zero cost. (Peter Deutsch)
Weibull’s Power Law:
The logarithm of failure rates increases linearly with the logarithm of age. (Waloddi Weibull)
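Stated symbolically (a standard restatement of the Weibull hazard function, not part of the original list), with shape parameter k and scale parameter λ the failure rate is a power of age, so it plots as a straight line of slope k − 1 on log-log axes:

```latex
% Weibull hazard (instantaneous failure) rate, shape k > 0, scale \lambda > 0:
h(t) = \frac{k}{\lambda}\left(\frac{t}{\lambda}\right)^{k-1}
% Taking logarithms gives the linear relationship the law describes:
\ln h(t) = \bigl(\ln k - k\ln\lambda\bigr) + (k-1)\ln t
```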
Zawinski’s Law:
Every program attempts to expand until it can read mail. Those programs which cannot so expand are replaced by ones which can. (Jamie Zawinski)