Meta-Algorithmics: Patterns for Robust, Low-Cost, High-Quality Systems
By Steven J. Simske (auth.)
- Year: 2013
- Language: English
- Pages: 384
- Category: Library
Synopsis
The confluence of cloud computing, parallelism and advanced machine intelligence approaches has created a world in which the optimum knowledge system will usually be architected from the combination of two or more knowledge-generating systems. There is a need, then, to provide a reusable, broadly applicable set of design patterns to empower the intelligent system architect to take advantage of this opportunity.
This book explains how to design and build intelligent systems that are optimized for changing system requirements (adaptability), for changing system input (robustness), and for one or more other important system parameters (e.g., accuracy, efficiency, cost). It provides an overview of traditional parallel processing, which consists primarily of task and component parallelism, before introducing meta-algorithmic parallelism, which is based on combining two or more algorithms, classification engines, or other systems.
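As a minimal illustration of the combining idea described above (not code from the book itself), a first-order "Voting" pattern can be sketched in a few lines: several independent classification engines each predict a label, and the combined system returns the majority choice. The engines here are hypothetical stand-ins for real classifiers.

```python
# Illustrative sketch of a simple majority-vote combiner, a first-order
# meta-algorithmic pattern: combine two or more classification engines
# by letting each one predict and returning the most common label.
from collections import Counter

def vote(classifiers, sample):
    """Return the majority label among the classifiers' predictions."""
    predictions = [clf(sample) for clf in classifiers]
    label, _count = Counter(predictions).most_common(1)[0]
    return label

# Three toy "engines" (placeholders for real classifiers) disagree on a sample;
# the combined system sides with the majority.
engines = [lambda x: "cat", lambda x: "cat", lambda x: "dog"]
print(vote(engines, sample=None))  # -> cat
```

More sophisticated variants in this family weight each engine's vote by its measured accuracy on a validation set, rather than counting all engines equally.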
Key features:
- Explains the entire roadmap for the design, testing, development, refinement, deployment and statistics-driven optimization of building systems for intelligence
- Offers an accessible yet thorough overview of machine intelligence, in addition to having a strong image processing focus
- Contains design patterns for parallelism, especially meta-algorithmic parallelism, that are simply conveyed, reusable, and proven effective, and can be readily included in the toolbox of experts in analytics, system architecture, big data, security, and many other science and engineering disciplines
- Connects algorithms and analytics to parallelism, thereby illustrating a new way of designing intelligent systems compatible with the tremendous changes in the computing world over the past decade
- Discusses application of the approaches to a wide range of fields; primarily, document understanding, image understanding, biometrics and security printing
- Companion website contains sample code and data sets
Content:
Chapter 1 Introduction and Overview (pages 1–41)
Chapter 2 Parallel Forms of Parallelism (pages 42–72)
Chapter 3 Domain Areas: Where are These Relevant? (pages 73–103)
Chapter 4 Applications of Parallelism by Task (pages 104–136)
Chapter 5 Application of Parallelism by Component (pages 137–174)
Chapter 6 Introduction to Meta-Algorithmics (pages 175–240)
Chapter 7 First-Order Meta-Algorithmics and their Applications (pages 241–271)
Chapter 8 Second-Order Meta-Algorithmics and their Applications (pages 272–309)
Chapter 9 Third-Order Meta-Algorithmics and their Applications (pages 310–341)
Chapter 10 Building More Robust Systems (pages 342–359)
Chapter 11 The Future (pages 360–368)
SIMILAR VOLUMES
Design for Manufacturability: How to Use Concurrent Engineering to Rapidly Develop Low-Cost, High-Quality Products for Lean Production shows how to use concurrent engineering teams to design products for all aspects of manufacturing with the lowest cost, the highest quality, and …
Achieve any cost goals in half the time and achieve stable production with quality designed in right-the-first-time. Design for Manufacturability: How to Use Concurrent Engineering to Rapidly Develop Low-Cost, High-Quality Products for Lean Production is still the de…
The first comprehensive and in-depth guide to microvia and wafer level chip scale package (WLCSP) technologies. This reference gives you cutting edge information on the most important developments and latest research results in applying the microvia and WLCSP technologies to low-cost and high-densit…
Robust statistics is the study of designing estimators that perform well even when the dataset significantly deviates from the idealized modeling assumptions, such as in the presence of model misspecification or adversarial outliers in the dataset. The classical statistical theory, dating back to pi…