Mcs-014 Solved Assignment 2012 Electoral Votes

Classification is a data mining technique used to predict group membership for data instances whose classes are unknown [35]. Pattern classification involves building a function that maps the input feature space to an output space of two or more classes. Decision trees, Bayesian models, and artificial neural networks (ANNs) are examples of effective classifiers in the field of pattern classification [29]. However, by the No Free Lunch theorem, no single classifier can be considered optimal for all tasks. In an attempt to improve the recognition performance of a single classifier, a common approach is to combine several classifiers, forming what is called a multi-classifier system (MCS) [36].

There are several reasons for combining multiple classifiers to solve a given learning task [29]. First, an MCS, also known as an ensemble or committee, exploits the idea that a pool of different classifiers, referred to as experts, can offer complementary information about the patterns to be classified, improving the effectiveness of the overall recognition process. Second, in some cases, an ensemble might not outperform the single best classifier but can diminish or eliminate the risk of picking an inadequate single classifier. Another reason for ensembles arises from the limited representational capability of learning algorithms: it is possible that the classifier space considered for the task does not contain the optimal classifier.

Successful applications of MCS have been reported in various works in the literature, such as handwritten digit recognition [13], signature verification [5], and image labeling [42], to name just a few. Most of these systems take one of two approaches: selection or fusion (SF) [46]. In classifier fusion, every classifier in the ensemble contributes to the final decision. Bagging [8], boosting [41], and the random subspace method (RSM) [22] are frequently used to generate the members, while arithmetic rules (e.g., maximum, mean, median, minimum, product), majority vote, or the use of another classifier are examples of strategies used to combine their decisions [37]. In classifier selection, each ensemble member is supposed to know well a part of the feature space and is responsible for the objects in that part [29].
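Two of the fusion strategies above, the mean (average) rule and majority vote, can be sketched on hypothetical class-probability outputs; the three classifiers and their scores below are illustrative stand-ins, not taken from the cited systems:

```python
import numpy as np

# Hypothetical example: three classifiers emit class probabilities for
# four test samples over three classes.
probs = np.array([
    [[0.7, 0.2, 0.1], [0.1, 0.8, 0.1], [0.3, 0.3, 0.4], [0.6, 0.3, 0.1]],
    [[0.6, 0.3, 0.1], [0.2, 0.7, 0.1], [0.2, 0.5, 0.3], [0.5, 0.4, 0.1]],
    [[0.8, 0.1, 0.1], [0.3, 0.6, 0.1], [0.1, 0.2, 0.7], [0.4, 0.4, 0.2]],
])  # shape: (n_classifiers, n_samples, n_classes)

# Mean rule: average the probabilities, then take the arg-max class.
mean_decision = probs.mean(axis=0).argmax(axis=1)

# Majority vote: each classifier votes its arg-max class; the most
# frequent vote per sample wins (ties broken by lowest class index).
votes = probs.argmax(axis=2)                    # (n_classifiers, n_samples)
majority = np.array([np.bincount(v).argmax() for v in votes.T])

print(mean_decision.tolist())  # → [0, 1, 2, 0]
print(majority.tolist())       # → [0, 1, 2, 0]
```

The other arithmetic rules (maximum, median, minimum, product) follow the same pattern, differing only in the reduction applied along the classifier axis.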

Most discussions and design methodologies for ensembles are devoted to the fusion approach and are concerned with achieving good performance through diversity measures and combination schemes; research on selection methodologies is less common. Basically, a selection scheme requires a means of partitioning the feature space and of estimating the performance of each classifier in each partition. In Woods’ DCS-LA approach [46], for example, the classification accuracy is estimated in a small region of the feature space surrounding an unknown test sample, and then the most locally accurate classifier is nominated to make the final decision. On the other hand, Kuncheva [27] presents an algorithm in which the training data are clustered to form the decision regions, and a confidence interval is used to determine whether one or multiple classifiers should make the final decision.
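A minimal sketch of selection by local accuracy, in the spirit of the DCS-LA idea described above, follows; the classifier objects, validation data, and the choice of k are illustrative assumptions, not the original implementation:

```python
import numpy as np

def dcs_la(classifiers, X_val, y_val, x, k=3):
    """Pick the classifier most accurate on the k validation samples
    nearest to test point x, and return its prediction for x."""
    dists = np.linalg.norm(X_val - x, axis=1)
    nn = np.argsort(dists)[:k]                 # k nearest validation samples
    local_acc = [np.mean(clf(X_val[nn]) == y_val[nn]) for clf in classifiers]
    best = int(np.argmax(local_acc))           # most locally accurate expert
    return classifiers[best](x[None, :])[0]

# Toy experts: one thresholds on feature 0, the other on feature 1.
clf_a = lambda X: (X[:, 0] > 0.5).astype(int)
clf_b = lambda X: (X[:, 1] > 0.5).astype(int)

X_val = np.array([[0.1, 0.9], [0.2, 0.8], [0.9, 0.1], [0.8, 0.2]])
y_val = np.array([1, 1, 0, 0])                 # labels follow feature 1 here
print(dcs_la([clf_a, clf_b], X_val, y_val, np.array([0.15, 0.85])))  # → 1
```

Near the test point, the second expert is locally perfect on the validation neighborhood, so it is selected and its prediction is used.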

Non-automatic design of ensembles often involves a tedious trial-and-error process. This might be appropriate where prior knowledge and an experienced expert are available, but for many real-world tasks both are hard to find in practice [2]. The goal of ensemble design is therefore to determine ensemble architectures automatically. In this paper, we report the performance of a novel automatic method, named SFJADE, that combines selection and fusion (SF) via adaptive differential evolution (JADE). JADE is a powerful stochastic real-parameter optimization algorithm in current use [16]. For the clustering phase, self-organizing maps (SOM) were chosen as a simple technique with good performance [12]. For the classification phase, the attractiveness of ANNs stems from their many inherent characteristics, including nonlinearity, high parallelism, robustness, fault tolerance, learning, and their capability to generalize [21].
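JADE's signature mutation operator, "DE/current-to-pbest/1", can be sketched as follows; the population size, test function, and parameter values are illustrative assumptions, and JADE's self-adaptation of F and CR (and its optional external archive) are omitted for brevity:

```python
import numpy as np

rng = np.random.default_rng(0)

def jade_mutation(pop, fitness, i, F=0.5, p=0.2):
    """Mutate individual i toward a random member of the best 100*p%
    of the population, plus a scaled difference of two random members."""
    n = len(pop)
    # x_pbest: drawn uniformly from the top-p fraction (lower fitness = better)
    top = np.argsort(fitness)[: max(1, int(round(p * n)))]
    pbest = pop[rng.choice(top)]
    # two distinct random members, both different from i
    r1, r2 = rng.choice([j for j in range(n) if j != i], size=2, replace=False)
    return pop[i] + F * (pbest - pop[i]) + F * (pop[r1] - pop[r2])

pop = rng.uniform(-1, 1, size=(10, 4))         # 10 candidate solutions in R^4
fitness = np.sum(pop ** 2, axis=1)             # sphere function as a stand-in
mutant = jade_mutation(pop, fitness, i=0)
print(mutant.shape)                            # → (4,)
```

Pulling each individual toward a randomly chosen top-p solution, rather than the single best, is what gives JADE its balance between greediness and population diversity.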

This paper is organized as follows. Section 2 gives the theoretical justification for selection and fusion by means of clustering algorithms; Sect. 3 presents related works that show strengths and weaknesses in the development of ensembles; Sect. 4 presents the evolutionary algorithm used in the current work; Sect. 5 describes the basic idea of the proposed methodology; Sect. 6 presents the experimental results; finally, Sect. 7 presents some final considerations about the main topics covered in this work, including the contributions achieved and directions for future work.

The book is available in English and Hindi medium.


Block- 1 Consumer Behaviour
Unit-1 Theory of Consumer Behaviour: Basic Themes
Unit-2 Theory of Demand: Alternative Approaches
Unit-3 Recent Developments of Demand Theory

Block- 2 Producer Behaviour
Unit-4 Theory of Production
Unit-5 Theory of Cost
Unit-6 Production Economics

Block- 3 Price and Output Determination - I
Unit-7 Perfect Competition
Unit-8 Monopoly
Unit-9 Monopolistic Competition

Block- 4 Price and Output Determination-II
Unit-10 Non-Collusive Oligopoly
Unit-11 Collusive Oligopoly
Unit-12 Alternative Theory of Firm-I
Unit-13 Alternative Theory of Firm-II

Block- 5 Welfare Economics
Unit-14 Pigovian vs. Paretian Approach
Unit-15 Social Welfare Function
Unit-16 Imperfect Market, Externality and Public Goods
Unit-17 Social Choice and Welfare

Block- 6 General Equilibrium
Unit-18 Partial and General Equilibrium Approaches: Pure Exchange Model
Unit-19 Production without Consumption

Block- 7 Economics of Uncertainty
Unit-20 Choice in Uncertain Situations
Unit-21 Insurance Choice and Risk
Unit-22 Economics of Information

Block- 8 Non-Cooperative Game Theory
Unit-23 Modeling Competitive Situations
Unit-24 Solution Concepts of Non-Cooperative Games
Unit-25 Repeated Games
Unit-26 Games of Incomplete Information

1. Solution Paper - June 2010
2. Solution Paper - Dec 2010
3. Solution Paper - June 2011
4. Solution Paper - Dec 2011
5. Solution Paper - June 2012
6. Solution Paper - Dec 2012

MEC-001 Microeconomic Analysis in English

Rs. 180 (Minimum discount 40%) and Postage Charges Extra
No. of Pages: 248
Author: Sant Kumar
ISBN: 978-93-81066-59-1

MEC-001 Microeconomic Analysis in Hindi
Rs. 180 (Minimum discount 40%) and Postage Charges Extra
No. of Pages: 376
Author: GPH Panel of Expert
ISBN: 978-93-81970-48-5


