Meta-Algorithmics: Patterns for Robust, Low Cost, High Quality Systems

Steven J. Simske

Language: English

Pages: 386

ISBN: 1118343360

Format: PDF / Kindle (mobi) / ePub


The confluence of cloud computing, parallelism and advanced machine intelligence approaches has created a world in which the optimum knowledge system will usually be architected from the combination of two or more knowledge-generating systems. There is a need, then, to provide a reusable, broadly-applicable set of design patterns to empower the intelligent system architect to take advantage of this opportunity.

This book explains how to design and build intelligent systems that are optimized for changing system requirements (adaptability), optimized for changing system input (robustness), and optimized for one or more other important system parameters (e.g., accuracy, efficiency, cost). It provides an overview of traditional parallel processing, which is shown to consist primarily of task and component parallelism, before introducing meta-algorithmic parallelism, which is based on combining two or more algorithms, classification engines, or other systems.
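To make this concrete, the following is a minimal, self-contained Python sketch of meta-algorithmic parallelism in its simplest form: several independent engines classify the same input, and a weighted vote combines their outputs. The toy classifiers, weights, and labels are illustrative assumptions, not examples taken from the book.

    # Weighted voting over multiple classification engines (toy illustration).
    from collections import defaultdict

    def combine_by_weighted_voting(engines, sample):
        """Run each (engine, weight) pair on the sample and return the label
        with the largest total vote weight."""
        votes = defaultdict(float)
        for engine, weight in engines:
            votes[engine(sample)] += weight
        return max(votes, key=votes.get)

    # Three toy "engines" that each classify a text as 'spam' or 'ham'.
    engine_a = lambda text: "spam" if "free" in text.lower() else "ham"
    engine_b = lambda text: "spam" if text.count("!") > 2 else "ham"
    engine_c = lambda text: "spam" if "winner" in text.lower() else "ham"

    engines = [(engine_a, 0.5), (engine_b, 0.2), (engine_c, 0.3)]
    print(combine_by_weighted_voting(engines, "You are a WINNER! Free prize inside!!!"))
    # -> 'spam': the weighted majority of the three engines agrees

The same combiner could wrap far heavier engines (OCR modules, classifiers, search back ends); the pattern only assumes that each engine maps the same input to a comparable output.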

Key features:

  • Explains the entire roadmap for the design, testing, development, refinement, deployment, and statistics-driven optimization of systems built for intelligence
  • Offers an accessible yet thorough overview of machine intelligence, in addition to having a strong image processing focus
  • Contains design patterns for parallelism, especially meta-algorithmic parallelism – patterns that are simply conveyed, reusable, and proven effective, and that can be readily included in the toolbox of experts in analytics, system architecture, big data, security, and many other science and engineering disciplines
  • Connects algorithms and analytics to parallelism, thereby illustrating a new way of designing intelligent systems compatible with the tremendous changes in the computing world over the past decade
  • Discusses application of the approaches to a wide range of fields, primarily document understanding, image understanding, biometrics, and security printing
  • Companion website contains sample code and data sets

distinguished from the second-order patterns, however, in the tight coupling between the multiple steps in the algorithm. As such, the analysis tools—feedback, sensitivity analysis, regional optimization, and hybridization being the primary ones—tightly couple not only one step to the next but also connect the downstream steps back to the earlier steps. Nowhere is this more evident than in the first third-order meta-algorithmic pattern: the simple Feedback pattern, in which errors in the

the imbalance in the expected workflow itself. A very balanced expected workflow (Table 2.1) is much more resilient to variability in the actual workflow.

2.3.2 Application to Data Mining, Search, and Other Algorithms

Data mining and search are tasks well suited to parallelism by component. Text mining algorithms, for example, begin with word counts. Word-forming rules are used to decide, for example, whether to stem or lemmatize. Lemmatization is the grouping together of the inflected forms of
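As a minimal illustration of the word-count step just described, the short Python sketch below tokenizes a text, maps each inflected form to its lemma, and counts lemma frequencies. The tiny lemma table and sample sentence are assumptions for illustration; a real text-mining pipeline would apply full word-forming rules via a stemmer or lemmatizer.

    # Word counts with a toy lemma map that groups inflected forms together.
    from collections import Counter
    import re

    LEMMAS = {"mines": "mine", "mined": "mine", "mining": "mine",
              "searches": "search", "searched": "search", "searching": "search"}

    def lemma_counts(text):
        """Tokenize, map each token to its lemma, and count lemma frequencies."""
        tokens = re.findall(r"[a-z]+", text.lower())
        return Counter(LEMMAS.get(tok, tok) for tok in tokens)

    print(lemma_counts("Searching and searches: the miner mined what mining mines."))
    # Counter({'mine': 3, 'search': 2, ...}) -- inflected forms collapse to one lemma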

digitization, analysis, interpretation, classification, and so on—are evaluated, the individual generators tend to make mistakes on different types of content. These differences are a legacy of the differences in how those generators were created, tested, deployed, changed, and upgraded over the years. These differences are often used by their owners to highlight the advantages of one generator over its competitors, which from a financial standpoint is both expected and rational. However, from a

3-channel color images of natural scenes, it was found that the relative processing time of each of these three factors is given by the following:

1. Subimage convolution overhead ratio = 1.01 = k1
2. Subimage formation and reversal overhead = 0.07 = k2
3. Relative processing time = 1.00 = k3

Using these three values {k1, k2, k3}, the overall performance timing (OPT) can be exactly calculated from OPT = k1 × (subimage convolution overhead ratio – 1.0) + k2 × (subimage formation and reversal

and other intelligent systems as their starting points. Meta-algorithmics combine multiple models to make better decisions, meaning that, for example, bagging, boosting, stacked generalization, and random subspace methods could all be used together to create a more accurate, more robust, and/or more cost-effective system.

1.6 Machine Learning/Intelligence

The distinction between machine learning/intelligence and artificial intelligence is somewhat arbitrary. Here, I have decided to term those
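The excerpt above suggests how ensemble methods can themselves become components of a meta-algorithmic system. The sketch below, which assumes scikit-learn is available (the dataset and estimator settings are illustrative, not the book's own examples), combines a bagging ensemble, a boosting ensemble, and a random-subspace ensemble by simple majority voting.

    # Combining bagging, boosting, and random-subspace ensembles by voting.
    from sklearn.datasets import load_digits
    from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier, VotingClassifier
    from sklearn.model_selection import cross_val_score

    X, y = load_digits(return_X_y=True)

    bagging  = BaggingClassifier(n_estimators=50, random_state=0)
    boosting = AdaBoostClassifier(n_estimators=50, random_state=0)
    subspace = BaggingClassifier(n_estimators=50, max_features=0.5,
                                 bootstrap=False, random_state=0)  # random subspace

    # Hard (majority) voting over the three component ensembles.
    combined = VotingClassifier(
        estimators=[("bag", bagging), ("boost", boosting), ("subspace", subspace)],
        voting="hard",
    )

    for name, model in [("bagging", bagging), ("boosting", boosting),
                        ("subspace", subspace), ("combined", combined)]:
        scores = cross_val_score(model, X, y, cv=5)
        print(f"{name:9s} mean accuracy: {scores.mean():.3f}")

Whether the combined system outperforms its best component depends on how decorrelated the component errors are, which echoes the earlier observation that individual generators tend to make mistakes on different types of content.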
