From: Walden H. Leverich
Which is a more closely guarded secret, Google's search
algorithm, or their systems management process?
Yeah, and the answer is that they closely guard their systems management process, apparently. But that supports my original hypothesis that Web servers and application servers are not well equipped to manage intense workloads by themselves. Evidently, it takes a human element. That's what makes this discussion interesting. What are the things that humans do to tune performance?
That reminds me of Frank Soltis' book, Fortress Rochester, which goes into some detail about IBM i task dispatching & workload management, which was interesting to me. Frank often understates his position, but he acknowledged that IBM i task dispatching & workload management are arguably the best in the world. The database is built into the OS. Low-level routines know precisely what every CPU is doing, and task dispatching is geared to maximize throughput for business workloads. Threads retain processor affinity, and so forth. You simply don't find that level of integration under distributed, multi-tier architectures - where request dispatching is handled by external load-balancers - where the database is on a remote platform - with application servers in-between.
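Just to make the processor-affinity point concrete: on IBM i the dispatcher handles this inside the OS, but on Linux you can express the same idea explicitly. A minimal sketch (assuming a Linux box; `pin_to_cpu` is my own illustrative helper, not anything from IBM i):

```python
import os

def pin_to_cpu(cpu: int) -> set:
    """Pin the current process to a single CPU and return the new mask.

    Keeping a task on one CPU lets its working set stay warm in that
    CPU's caches - the same benefit the IBM i dispatcher gets by
    retaining thread/processor affinity automatically.
    """
    os.sched_setaffinity(0, {cpu})   # 0 means "the calling process"
    return os.sched_getaffinity(0)   # read back the effective mask

print(pin_to_cpu(0))
```

The contrast is that here a human (or an ops script) has to decide which CPU each workload gets, whereas the integrated dispatcher makes that call with full knowledge of what every CPU is doing.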
Nathan.
This mailing list archive is Copyright 1997-2024 by midrange.com and David Gibbs as a compilation work. Use of the archive is restricted to research of a business or technical nature. Any other uses are prohibited. Full details are available on our policy page.