Showing 1 to 100 of 127 matching Articles
By
Coello Coello, Carlos A.
4 Citations
Summary
In this paper, we briefly discuss the current state of research on evolutionary multiobjective optimization, emphasizing the main achievements obtained to date. Achievements in algorithmic design are discussed from their early origins up to the current approaches, which are considered the “second generation” of evolutionary multiobjective optimization. Some relevant applications are discussed as well, and we conclude with a list of future challenges for researchers working (or planning to work) in this area in the next few years.
By
Domínguez-Medina, Christian; Cruz-Cortés, Nareli
2 Citations
Wireless Sensor Networks have become an active research topic in recent years. Routing is a very important part of this kind of network that must be considered in order to maximize the network lifetime. As the size of the network increases, routing becomes more complex due to the number of sensor nodes in the network. Sensor nodes in Wireless Sensor Networks are severely constrained in memory capacity, processing power and battery life. Ant Colony Optimization based routing algorithms have been proposed to solve the routing problem while dealing with these constraints. We present a comparison of two Ant Colony-based routing algorithms, taking into account energy consumption under different scenarios and reporting the usual metrics for routing in wireless sensor networks.
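Ant-based routing algorithms like the two compared here share the classic ACO pheromone update of evaporation plus deposit. The sketch below shows that generic rule only, not either paper's exact algorithm; `rho` (evaporation rate) and `q` (deposit constant) are hypothetical parameter names.

```python
# Generic ACO pheromone update (illustrative sketch, not the compared
# algorithms themselves). Edges are (node, node) tuples.

def update_pheromone(pheromone, path, path_cost, rho=0.1, q=1.0):
    """Evaporate all trails, then deposit pheromone along one ant's path."""
    for edge in pheromone:
        pheromone[edge] *= (1.0 - rho)              # evaporation on every edge
    for edge in zip(path, path[1:]):
        pheromone[edge] = pheromone.get(edge, 0.0) + q / path_cost  # deposit

pheromone = {("a", "b"): 1.0, ("b", "c"): 1.0}
update_pheromone(pheromone, ["a", "b", "c"], path_cost=2.0)
# each path edge now holds 1.0 * 0.9 + 1.0 / 2.0 = 1.4 (up to float rounding)
```

Shorter (cheaper) paths deposit more pheromone per edge, which is what biases future ants toward energy-efficient routes.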
By
Gaxiola, Fernando; Melin, Patricia; Valdez, Fevrier; Castillo, Oscar
1 Citation
This paper presents a new modular neural network architecture that is used to build a system for pattern recognition based on the iris biometric measurement of persons. In this system, the person iris database is enhanced with image processing methods, and the coordinates of the center and radius of the iris are obtained to make a cut of the area of interest by removing the noise around the iris. The inputs to the modular neural network are the processed iris images and the output is the number of the identified person. The integration of the modules was done with a type-2 fuzzy integrator at the level of the submodules, and with a gating network at the level of the modules.
By
Sánchez, Daniela; Melin, Patricia; Castillo, Oscar
In this paper we propose a new model of a Modular Neural Network (MNN) with fuzzy integration based on granular computing. The topology and parameters of the model are optimized with a Hierarchical Genetic Algorithm (HGA). The model was applied to the case of human recognition to illustrate its applicability. The proposed method is able to divide the data automatically into submodules, to work with a percentage of the images, and to select which images will be used for training. To test this method, we considered the problem of human recognition based on the ear, using a database of 77 persons (with 4 images per person).
By
Rodríguez, Lisbeth; Li, Xiaoou; Mejía-Alvarez, Pedro
2 Citations
Vertical partitioning is a well-known technique to improve query response time in relational databases. It consists of dividing a table into a set of fragments of attributes according to the queries run against the table. In dynamic systems the queries tend to change with time, so a dynamic vertical partitioning technique is needed that adapts the fragments to changes in query patterns in order to avoid long query response times. In this paper, we propose an active system for dynamic vertical partitioning of relational databases, called DYVEP (DYnamic VErtical Partitioning). DYVEP uses active rules to vertically fragment and refragment a database without the intervention of a database administrator (DBA), maintaining acceptable query response times even when the query patterns in the database change. Experiments with the TPC-H benchmark demonstrate efficient query response times.
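A common way to drive such fragmentation is attribute affinity: attributes that are frequently accessed together by the same queries end up in the same fragment. The sketch below illustrates that general idea only (it is not DYVEP, and the threshold-based grouping is an assumption for illustration).

```python
from itertools import combinations

def affinity_matrix(queries):
    """queries: list of (attribute_set, frequency) pairs -> pairwise affinity."""
    aff = {}
    for attrs, freq in queries:
        for a, b in combinations(sorted(attrs), 2):
            aff[(a, b)] = aff.get((a, b), 0) + freq
    return aff

def fragments(attributes, aff, threshold):
    """Greedily merge attributes whose pairwise affinity >= threshold (union-find)."""
    parent = {a: a for a in attributes}
    def find(x):
        while parent[x] != x:
            x = parent[x]
        return x
    for (a, b), v in aff.items():
        if v >= threshold:
            parent[find(a)] = find(b)
    groups = {}
    for a in attributes:
        groups.setdefault(find(a), set()).add(a)
    return list(groups.values())

queries = [({"id", "name"}, 10), ({"id", "name"}, 5), ({"salary", "dept"}, 8)]
print(fragments({"id", "name", "salary", "dept"}, affinity_matrix(queries), 5))
# two fragments: {id, name} and {salary, dept}
```

A dynamic scheme would recompute the affinities as the query log evolves and refragment when the grouping changes.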
By
Sanchez-Diaz, Guillermo; Piza-Davila, Ivan; Lazo-Cortes, Manuel; Mora-Gonzalez, Miguel; Salinas-Luna, Javier
3 Citations
Typical testors are a useful tool both for feature selection and for determining feature relevance in supervised classification problems. Generating all typical testors of a training matrix is computationally expensive; all reported algorithms have exponential complexity, depending mainly on the number of columns in the training matrix. For this reason, different approaches such as sequential and parallel algorithms, genetic algorithms and hardware implementation techniques have been developed. In this paper, we introduce a fast implementation of the CT_EXT algorithm (one of the fastest algorithms reported), based on an accumulative binary tuple, for generating all typical testors of a training matrix. The accumulative binary tuple implemented in the CT_EXT algorithm simplifies the search for feature combinations that fulfill the testor property, because it decreases the number of operations involved in the process of generating all typical testors. In addition, experimental results using the proposed fast implementation of the CT_EXT algorithm, and comparisons with other state-of-the-art algorithms that generate typical testors, are presented.
By
Mújica-Vargas, Dante; Gallegos-Funes, Francisco Javier; Rosales-Silva, Alberto J.; Cruz-Santiago, Rene
1 Citation
Image segmentation is a key step in many image analysis applications. So far, no general method exists that suitably segments all images, regardless of whether they are corrupted by noise or noise-free. In this paper, we propose to modify the Fuzzy C-means clustering algorithm and the FCM_S1 variant by using the RML-estimator. The idea of our method is to obtain robust clustering algorithms able to segment images with different types and levels of noise. The performance of the proposed algorithms is tested on synthetic and real images. Experimental results show that the proposed algorithms are more robust to the presence of noise, and more effective, than the comparative algorithms.
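The baseline these modifications build on is standard Fuzzy C-means, which alternates a membership update and a weighted center update. A minimal 1-D sketch of that baseline follows (the RML-estimator variant itself is not reproduced; initialization and data are illustrative assumptions).

```python
# Standard Fuzzy C-means on 1-D data: alternate membership and center updates.

def fcm(data, c=2, m=2.0, iters=50):
    lo, hi = min(data), max(data)
    centers = [lo + i * (hi - lo) / (c - 1) for i in range(c)]  # spread initial centers
    for _ in range(iters):
        # membership update: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
        u = []
        for x in data:
            d = [max(abs(x - v), 1e-12) for v in centers]
            u.append([1.0 / sum((d[i] / d[j]) ** (2.0 / (m - 1.0))
                                for j in range(c)) for i in range(c)])
        # center update: mean of the data weighted by memberships raised to m
        centers = [sum(u[k][i] ** m * x for k, x in enumerate(data)) /
                   sum(u[k][i] ** m for k in range(len(data)))
                   for i in range(c)]
    return sorted(centers)

data = [0.1, 0.2, 0.15, 5.0, 5.2, 4.9]
print(fcm(data))  # one center near 0.15, one near 5.0
```

Robust variants such as the one proposed replace the squared-distance influence of noisy samples with a more outlier-tolerant estimator.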
By
Flores, Juan J.; Rodriguez, Hector; Graff, Mario
2 Citations
Evolutionary design of time series predictors is a field that has been explored for several years now. The level of design varies among the many works reported in the field. We decided to perform a complete design and training of ARIMA models using Evolutionary Computation. This decision leads to high-dimensional search spaces, whose size increases exponentially with dimensionality. In order to reduce the size of those search spaces, we propose a method that performs a preliminary statistical analysis of the inputs involved in the model design and their impact on the quality of results; as a result of the statistical analysis, we eliminate inputs that are irrelevant to the prediction task. The proposed methodology proves to be effective and efficient: the results increase in accuracy and the computing time required to produce the predictors decreases.
By
Gomez, Laura E.; Sossa, Humberto; Barron, Ricardo; Jimenez, Julio F.
A new method for the retrieval of melodies from a database is described in this paper. The method makes use of Dynamic Neural Networks (DNNs). During training, a set of DNNs is first trained with information from the melodies to be retrieved. Instead of using traditional signal descriptors, we use the matrix of synaptic weights, which can be efficiently used for melody representation and retrieval. Most of the reported works have focused on the symbolic representation of musical information; none of them have provided good results with original signals.
By
Colores, Juan M.; García-Vázquez, Mireya; Ramírez-Acosta, Alejandro; Pérez-Meana, Héctor
5 Citations
During video acquisition in an automatic non-cooperative biometric iris recognition system, not all the iris images obtained from the video sequence are suitable for recognition. Hence, it is important to acquire high-quality iris images and quickly identify them in order to eliminate the poor-quality ones (mostly defocused images) before subsequent processing. In this paper, we present the results of a comparative analysis of four methods for iris image quality assessment to select clear images from the video sequence. The goal is to provide a solid analytic ground to underscore the strengths and weaknesses of the most widely implemented methods for iris image quality assessment. The methods are compared based on their robustness to different types of iris images and the computational effort they require. The experiments with the built database (100 videos from MBGC v2) demonstrate that the best performance scores are generated by the kernel proposed by Kang & Park. The FAR and FRR obtained are 1.6% and 2.3%, respectively.
By
Jiménez Vargas, Sergio; Gelbukh, Alexander
1 Citation
Soft cardinality (SC) is a softened version of the classical cardinality of set theory. However, given its prohibitive computational cost (exponential order), an approximation that is quadratic in the number of terms in the text has been proposed in the past. SC Spectra is a new linear-time approximation method for text strings, which divides text strings into consecutive substrings (i.e., q-grams) of different sizes. Thus, SC in combination with resemblance coefficients allows the construction of a family of similarity functions for text comparison. These similarity measures have been used in the past to address an entity resolution (name matching) problem, outperforming the SoftTFIDF measure. The SC spectra method improves on the previous results, using less time and obtaining better performance. This allows the new method to be used with relatively large documents, such as those included in classic information retrieval collections. The SC spectra method exceeded the SoftTFIDF and cosine tf-idf baselines with an approach that requires no term weighting.
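The core ingredients, substrings of several sizes combined with a resemblance coefficient, can be sketched as follows. This toy version uses plain Jaccard resemblance over q-gram sets; the actual SC Spectra combination of soft cardinality and coefficients is more elaborate, and the choice of sizes (2, 3) is an assumption.

```python
# Sketch of the q-gram spectra idea: compare strings by their substrings
# of several sizes, averaged with a resemblance coefficient (Jaccard here).

def qgrams(s, q):
    return {s[i:i + q] for i in range(len(s) - q + 1)}

def spectra_similarity(a, b, sizes=(2, 3)):
    """Average Jaccard resemblance over several q-gram sizes."""
    scores = []
    for q in sizes:
        ga, gb = qgrams(a, q), qgrams(b, q)
        if ga or gb:
            scores.append(len(ga & gb) / len(ga | gb))
    return sum(scores) / len(scores) if scores else 0.0

print(spectra_similarity("jonathan", "jonatan"))  # high but below 1.0
```

Because q-gram extraction is linear in the string length, the whole comparison stays cheap enough for large documents, which is the point of the approximation.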
By
Kuri-Morales, Angel
Structured databases which include both numerical and categorical attributes (Mixed Databases, or MDs) ought to be adequately preprocessed so that machine learning algorithms may be applied to their analysis and further processing. Of primordial importance is that the instances of all the categorical attributes be encoded so that the patterns embedded in the MD are preserved. We discuss CESAMO, an algorithm that achieves this by statistically sampling the space of possible codes. CESAMO’s implementation requires determining the moment when the codes distribute normally. It also requires the approximation of an encoded attribute as a function of the other attributes, such that the best code assignment may be identified. The MD’s categorical attributes are thus mapped into purely numerical ones. The resulting numerical database (ND) is then accessible to supervised and unsupervised learning algorithms. We discuss CESAMO, normality assessment and functional approximation. A case study of the US census database is described. The data is made strictly numerical using CESAMO; Neural Networks and Self-Organizing Maps are then applied. Our results are compared to classical analysis, and we show that CESAMO’s application yields better results.
By
López, Marco A.; Marcial-Romero, J. Raymundo; Ita, Guillermo; Valdovinos, Rosa M.
In this paper we present an implementation (markSAT) for computing #2SAT via graph transformations. To this end, we transform the input formula into a graph and test whether it is what we call a cactus graph. If this is not the case, the formula is decomposed until cactus subformulas are obtained. We compare the efficiency of markSAT against sharpSAT, the leading sequential algorithm in the literature for computing #SAT, obtaining better results with our proposal.
By
Naredo, Enrique; Castillo, Oscar
5 Citations
We describe the use of Ant Colony Optimization (ACO) for the ball and beam control problem, in particular for the problem of tuning a fuzzy controller of the Sugeno type. In our case study the controller has four inputs, each of them with two membership functions; we consider the intersection point of every pair of membership functions as the main parameter, and their individual shapes as secondary ones, in order to tune the fuzzy controller using an ACO algorithm. Simulation results show that using ACO, and coding the problem with just three parameters instead of six, allows us to find an optimal set of membership function parameters for the fuzzy control system with less computational effort.
By
Zatarain-Cabada, Ramón; Barrón-Estrada, María Lucia; Ríos-Félix, José Mario
The growing demand for software tools that encourage and support students in learning design and algorithm implementation has led to the creation of such software systems. In this paper we present a new and innovative affective tutoring system for logic and algorithmic programming, based on block techniques. Our approach combines Google Blockly’s interface with gamification techniques and exercises that are monitored according to the emotional state of the student. Depending on the expressed emotion (boredom, engagement, frustration, or neutral), the system evaluates a number of variables to determine whether the student requires assistance. Tests have shown that detection of the emotional state of the student favorably affects student evaluations.
By
Praga-Alejo, Rolando J.; Torres-Treviño, Luis M.; González, David S.; Acevedo-Dávila, Jorge; Cepeda, Francisco
Neural Networks (NNs) have been widely used in many industrial processes for prediction and optimization, and they have proven to be useful tools for explaining complex processes. The main objective of this work is to improve the accuracy of a Radial Basis Function Neural Network redesigned by a Genetic Algorithm and Mahalanobis distance for predicting a welding process. The evaluation function in this approach uses the coefficient of determination R^{2}. The results indicate that the statistical measure R^{2} is a good alternative for validating the efficiency of the Neural Network model. The principal conclusion of this work is that the Radial Basis Function network redesigned by a Genetic Algorithm and Mahalanobis distance performed very well in a real case, predicting specific responses of a welding process.
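The evaluation criterion R^{2} mentioned above has a standard definition that is worth making concrete; this is a minimal sketch of that measure alone, not of the paper's GA/Mahalanobis pipeline.

```python
# Coefficient of determination: 1 - (residual sum of squares / total sum of squares).

def r_squared(y_true, y_pred):
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

print(r_squared([1.0, 2.0, 3.0], [1.1, 1.9, 3.2]))  # close to 1 for a good fit
```

Values near 1 indicate the model explains most of the variance of the response, which is why it serves as a fitness/validation score for the redesigned network.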
By
Flores-Pulido, Leticia; Starostenko, Oleg; Rodríguez-Gómez, Gustavo; Portilla-Flores, Alberto; Mora-Lumbreras, Marva Angelica; Albores-Velasco, Francisco Javier; Sánchez, Marlon Luna; Cuamatzi, Patrick Hernández
In this paper, an analysis of the similarity metrics used for performance evaluation of image retrieval frameworks is provided. Image retrieval based on similarity metrics obtains remarkable results in comparison with robust discrimination methods. Similarity metrics are used in the matching process between the user's visual query and the descriptors of images in a preprocessed collection. In contrast, discrimination methods usually compare feature vectors, computing distances between the visual query and the images in the collections. In this research, the behavior of a spline radial basis function used as a metric for image similarity measurement is proposed and evaluated, comparing it with discrimination methods, particularly with the general principal component analysis algorithm (GPCA). The spline radial basis function has been tested in image retrieval using standard image collections such as COIL-100. The obtained results using the spline radial basis function report 88% correct image retrieval, avoiding the classification phase required in other well-known methods. The discussion of tests with the designed Image Data Segmentation with Spline (IDSS) framework illustrates optimization and improvement of the image retrieval process.
By
Lukin, Vladimir; Ponomarenko, Nikolay; Kurekin, Andrey; Pogrebnyak, Oleksiy
1 Citation
Several main practical tasks important for effective preprocessing of multichannel remote sensing (RS) images are considered, in order to reliably retrieve useful information from them and to provide availability of data to potential users. First, possible strategies of data processing are discussed. It is shown that one problem is to use more adequate models to describe the noise present in real images. Another problem is the automation of all or, at least, several stages of data processing, like determination of the noise type and its statistical characteristics, noise filtering, and image compression before applying classification at the final stage. Second, some approaches that are effective and able to perform well enough within automatic or semi-automatic frameworks for multichannel images are described and analyzed. The applicability of the proposed methods is demonstrated on particular examples of real RS data classification.
By
Hernández, Paula; Gómez, Claudia; Cruz, Laura; Ochoa, Alberto; Castillo, Norberto; Rivera, Gilberto
The computational optimization field defines the parameter tuning problem as the correct selection of parameter values in order to stabilize the behavior of an algorithm. This paper addresses parameter tuning under dynamic and large-scale conditions for an algorithm that solves the Semantic Query Routing Problem (SQRP) in peer-to-peer networks. To solve SQRP, the HH_AdaNAS algorithm is proposed, an ant colony algorithm that deals synchronously with two processes. The first process generates an SQRP solution. The second adjusts the Time To Live parameter of each ant through a hyperheuristic. HH_AdaNAS performs adaptive control through the hyperheuristic, considering SQRP local conditions. The experimental results show that HH_AdaNAS, incorporating parameter tuning techniques with hyperheuristics, increases its performance by 2.42% compared with the algorithms for solving SQRP found in the literature.
By
Valdez-Rodríguez, José E.; Calvo, Hiram; Felipe-Riverón, Edgardo M.
Depth reconstruction from single images has been a challenging task due to the complexity and the number of depth cues that images contain. Convolutional Neural Networks (CNNs) have been successfully used to reconstruct depth of general object scenes; however, these works have not been tailored to the particular problem of road perspective depth reconstruction. As we aim to build a computationally efficient model, we focus on single-stage CNNs. In this paper we propose two different models for solving this task. A particularity is that our models perform refinement in the same single-stage training; thus, we call them Reduce-Refine-Upsample (RRU) models because of the order of the CNN operations. We compare our models with the current state of the art in depth reconstruction, obtaining improvements in both global and local views for images of road perspectives.
By
Martínez, Luis G.; Castro, Juan R.; Licea, Guillermo; Rodríguez-Díaz, Antonio
In psychology, projective tests are interpretative and subjective, obtaining results based on the eye of the beholder; they are widely used because they yield rich and unique data. Because the measurement of drawing attributes has a degree of uncertainty, it is possible to explore a fuzzy model approach to better assess interpretative results. This paper presents a study of the tree projective test applied to software development teams as part of RAMSET’s (Role Assignment Methodology for Software Engineering Teams) methodology to assign specific roles within the team. Using a Takagi-Sugeno-Kang (TSK) Fuzzy Inference System (FIS), and also training data applying an ANFIS model to our case studies, we have obtained an application that can help in the role assignment decision process by recommending the roles best suited for performance in software engineering teams.
By
Valdez, Fevrier; Melin, Patricia; Castillo, Oscar
2 Citations
We describe in this paper an approach for mathematical function optimization using fuzzy logic for parameter tuning, combining Particle Swarm Optimization (PSO) and Genetic Algorithms (GAs). The proposed method combines the advantages of PSO and GA to give an improved FPSO+FGA hybrid method. Fuzzy logic helps to find the optimal parameters of PSO and GA in the best way possible. Also, with parameter tuning based on fuzzy logic it is possible to balance the exploration and exploitation of the proposed method. The hybrid method, called FPSO+FGA, was tested with a set of benchmark mathematical functions.
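The PSO half of such a hybrid reduces to the classic velocity/position update, whose parameters (inertia `w`, acceleration coefficients `c1`, `c2`) are exactly what fuzzy tuning would adjust. A minimal 1-D sketch with fixed parameters (not the FPSO+FGA method itself; all parameter values are illustrative assumptions):

```python
import random

# Plain PSO minimizing a 1-D function; w, c1, c2 are the knobs that a
# fuzzy tuner would adapt during the run.

def pso(f, lo, hi, n=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    random.seed(1)
    xs = [random.uniform(lo, hi) for _ in range(n)]
    vs = [0.0] * n
    pbest = xs[:]                        # each particle's best position
    gbest = min(xs, key=f)               # swarm's best position
    for _ in range(iters):
        for i in range(n):
            r1, r2 = random.random(), random.random()
            vs[i] = (w * vs[i] + c1 * r1 * (pbest[i] - xs[i])
                     + c2 * r2 * (gbest - xs[i]))
            xs[i] = min(max(xs[i] + vs[i], lo), hi)  # clamp to the bounds
            if f(xs[i]) < f(pbest[i]):
                pbest[i] = xs[i]
        gbest = min(pbest, key=f)
    return gbest

print(pso(lambda x: (x - 3.0) ** 2, -10, 10))  # converges near 3.0
```

Lowering `w` late in the run shifts the balance from exploration toward exploitation, which is the trade-off the abstract says fuzzy logic manages.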
By
Ríos Gaona, Miguel Angel; Gelbukh, Alexander; Bandyopadhyay, Sivaji
1 Citation
We present our experiments on Recognizing Textual Entailment based on modeling the entailment relation as a classification problem. As features to classify the entailment pairs we use a symmetric similarity measure and a non-symmetric similarity measure. Our system achieved an accuracy of 66% on the RTE-3 development dataset (with 10-fold cross validation) and an accuracy of 63% on the RTE-3 test dataset.
By
Zulfiqar, Ali; Muhammad, Aslam; Martinez-Enriquez, Ana Maria; Escalada-Imaz, G.
4 Citations
Not every feature extraction and modeling technique for voice/speech is suitable for all types of environments. In many real-life applications, it is not possible to use every feature extraction and modeling technique to design a single classifier for speaker identification tasks, because doing so would make the system complex. So, instead of exploring more techniques or making the system complex, it is more reasonable to develop classifiers using existing techniques and then combine them using different combination techniques to enhance the performance of the system. Thus, this paper describes the design and implementation of a VQ-HMM-based Multiple Classifier System using different combination techniques. The results show that the system developed using the confusion matrix significantly improves the identification rate.
By
Rechy-Ramírez, Fernando; Acosta-Mesa, Héctor-Gabriel; Mezura-Montes, Efrén; Cruz-Ramírez, Nicandro
3 Citations
In this work, we present a novel algorithm for time series discretization. Our approach includes the optimization of the word size and the alphabet as one parameter. Using evolutionary programming, the search for a good discretization scheme is guided by a cost function which considers three criteria: the entropy with respect to the classification, the complexity measured as the number of different strings needed to represent the complete data set, and the compression rate assessed as the length of the discrete representation. Our proposal is compared with some of the most representative algorithms found in the specialized literature, tested on a well-known benchmark of time series data sets. The statistical analysis of the classification accuracy shows that the overall performance of our algorithm is highly competitive.
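The first of the three cost criteria, entropy with respect to the classification, can be made concrete: measure how mixed the class labels are within each discrete symbol. This is an illustrative sketch of that single criterion (the evolutionary search and the other two criteria are not reproduced).

```python
import math

# Weighted class entropy of a discretization: 0 when every discrete
# symbol maps to a single class, higher when symbols mix classes.

def class_entropy(symbols, labels):
    groups = {}
    for s, y in zip(symbols, labels):
        groups.setdefault(s, []).append(y)
    n = len(symbols)
    total = 0.0
    for ys in groups.values():
        h = 0.0
        for c in set(ys):
            p = ys.count(c) / len(ys)
            h -= p * math.log2(p)
        total += len(ys) / n * h       # weight each symbol by its frequency
    return total

# a discretization that separates the classes perfectly has entropy 0
print(class_entropy(["a", "a", "b", "b"], [0, 0, 1, 1]))  # 0.0
```

A cost function would combine this value with the number of distinct strings and the representation length, so the search favors discretizations that are simultaneously discriminative and compact.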
By
Salinas-Gutiérrez, Rogelio; Muñoz-Zavala, Ángel Eduardo; Guerrero-Díaz de León, José Antonio; Hernández-Aguirre, Arturo
This work presents a metaheuristic based on the use of the beta distribution as a search distribution for solving numerical optimization problems in search spaces defined on two-sided intervals. The innovation of this work lies in the efficiency of the proposed method, which estimates the parameters of the beta distribution at minimal cost for each decision variable by using the method of moments. The numerical experiments provide evidence that applying the method of moments for parameter estimation, with the beta distribution as a search distribution, generates competitive results.
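The method-of-moments step is the cheap part the abstract highlights: given the sample mean and variance of the promising solutions on a [0, 1]-scaled variable, the beta shape parameters follow in closed form (valid when the variance is below mean·(1−mean)). A minimal sketch of that estimation step alone:

```python
# Method of moments for Beta(alpha, beta): invert
#   mean = a / (a + b),  var = a*b / ((a + b)^2 * (a + b + 1)).

def beta_from_moments(mean, var):
    common = mean * (1.0 - mean) / var - 1.0
    alpha = mean * common
    beta = (1.0 - mean) * common
    return alpha, beta

# check: Beta(2, 2) has mean 1/2 and variance 1/20
print(beta_from_moments(0.5, 0.05))  # ≈ (2.0, 2.0)
```

In the metaheuristic, each decision variable's beta distribution would be refit this way every generation and then sampled to produce new candidate solutions.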
By
Loyola-González, Octavio; Monroy, Raúl; Medina-Pérez, Miguel Angel; Cervantes, Bárbara; Grimaldo-Tijerina, José Ernesto
Nowadays, companies invest resources in detecting non-human accesses in their web traffic. Usually, non-human accesses are few compared with human accesses, which constitutes a class imbalance problem; as a consequence, classifiers bias their classification results toward the human accesses, overlooking the non-human ones. In some classification problems, such as non-human traffic detection, high accuracy is not the only desired quality: the model provided by the classifier should be understandable by experts. Therefore, in this paper, we study the use of contrast pattern-based classifiers for building an understandable and accurate model for detecting non-human traffic in web log files. Our experiments over five databases show that the contrast pattern-based approach obtains significantly better AUC results than other state-of-the-art classifiers.
By
Kolesnikova, Olga; Gelbukh, Alexander
The meaning of such verb-noun combinations as take care, undertake work, pay attention can be generalized as DO what is designated by the noun. Likewise, the meaning of make a decision, provide support, write a letter can be generalized as MAKE what is designated by the noun. These generalizations represent the meaning of certain groups of verb-noun combinations. We use supervised machine learning algorithms to predict the meanings DO, MAKE, BEGIN, and CONTINUE of previously unseen verb-noun pairs. We evaluate the performance of the applied algorithms on a training set using the 10-fold cross-validation technique. The learnt models have also been evaluated on an independent test set, and the predictions have been checked manually to determine the accuracy of the classifiers. The obtained results show that supervised machine learning methods achieve significant accuracy and can be used for semantic annotation of verb-noun combinations.
By
Mejía-Lavalle, Manuel; Victorio, Hermilo; Martínez, Alicia; Sidorov, Grigori; Sucar, Enrique; Pichardo-Lagunas, Obdulia
Good pedagogical actions are key components in all learning-teaching schemes, and automating them is an important objective of Intelligent Tutoring Systems. We propose applying Partially Observable Markov Decision Processes (POMDPs) in order to obtain automatic and optimal pedagogical recommended action patterns for the benefit of human students, in the context of an Intelligent Tutoring System. To achieve that goal, we first need to create an efficient POMDP solver framework with the ability to work with real-world tutoring cases. At present, there are several POMDP open tool solvers available on the Web, but their capacity is limited, as the experiments shown in this paper exhibit. In this work, we describe and discuss several design ideas toward obtaining an efficient POMDP solver that is useful in our problem domain.
By
Camarena-Ibarrola, Antonio; Chávez, Edgar
1 Citation
Real-time tracking of musical performances allows for the implementation of virtual teachers of musical instruments, automatic accompaniment of musicians or singers, and automatic addition of special effects in live presentations.
State-of-the-art approaches make a local alignment of the score (the target audio) and a musical performance; such a procedure induces cumulative error, since it assumes the rendition has been well tracked up to the current time. We propose searching for the k-nearest neighbors of the current audio segment among all audio segments of the score, and then using some heuristics to decide the current tracked position of the performance inside the score.
We tested the method with 62 songs, some pop music but mostly classical. For each song we have two performances; we use one of them as the score and the other as the music to be tracked, with excellent results.
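The k-NN-plus-heuristic idea can be sketched in a few lines: among the k score segments most similar to the current frame, prefer the one closest to the previously tracked position. The feature vectors, distance, and tie-breaking heuristic below are hypothetical placeholders, not the paper's actual audio features or heuristics.

```python
# Toy global search over score segments: k nearest by feature distance,
# then pick the candidate nearest to the previous tracked position.

def track_position(score_feats, frame, prev_pos, k=3):
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    candidates = sorted(range(len(score_feats)),
                        key=lambda i: dist(score_feats[i], frame))[:k]
    return min(candidates, key=lambda i: abs(i - prev_pos))

score = [(0.0,), (1.0,), (1.1,), (5.0,)]
print(track_position(score, (1.05,), prev_pos=2))  # → 2
```

Because every frame is matched globally rather than aligned locally, a momentary tracking error does not accumulate into later frames.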
By
Kreinovich, Vladik; Jacob, Christelle; Dubois, Didier; Cardoso, Janette; Ceberio, Martine; Batyrshin, Ildar
In many real-life applications (e.g., in aircraft maintenance), we need to estimate the probability of failure of a complex system (such as an aircraft as a whole or one of its subsystems). Complex systems are usually built with redundancy allowing them to withstand the failure of a small number of components. In this paper, we assume that we know the structure of the system, and, as a result, for each possible set of failed components, we can tell whether this set will lead to a system failure. For each component A, we know the probability P(A) of its failure with some uncertainty: e.g., we know the lower and upper bounds $\underline{P}(A)$ and $\overline{P}(A)$ for this probability. Usually, it is assumed that failures of different components are independent events. Our objective is to use all this information to estimate the probability of failure of the entire complex system. In this paper, we describe a new efficient method for such estimation based on Cauchy deviates.
By
Carbajal Hernández, José Juan; Sánchez Fernández, Luis Pastor
Associative memories have demonstrated their usefulness in the pattern processing field. The Hopfield model is an auto-associative memory that has problems in the recall phase; one of them is the time of convergence, or non-convergence in certain cases, with patterns badly recovered. In this paper, a new algorithm for the Hopfield associative memory eliminates iterative processes, reducing computing time and uncertainty in pattern recall. This algorithm is implemented using a corrective vector, which is computed from the Hopfield memory. The corrective vector adjusts misclassifications in the recalled output patterns. Results show a good performance of the proposed algorithm, providing an alternative tool for the pattern recognition field.
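The classic Hopfield baseline whose iterative recall the paper replaces can be sketched in a few lines: Hebbian training of the weight matrix over ±1 patterns, then repeated threshold updates until the state settles. This shows the standard model only, not the proposed corrective-vector algorithm.

```python
# Hebbian training and synchronous recall for a Hopfield memory over ±1 patterns.

def train(patterns):
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j]   # Hebbian outer product, zero diagonal
    return w

def recall(w, x, steps=5):
    for _ in range(steps):
        x = [1 if sum(wij * xj for wij, xj in zip(row, x)) >= 0 else -1
             for row in w]                    # threshold each unit's net input
    return x

p = [1, -1, 1, -1, 1, -1]
w = train([p])
print(recall(w, [1, -1, 1, -1, 1, 1]))  # noisy input settles back to p
```

The recall loop is exactly the iteration the abstract objects to: it may need several passes and can fail to converge; the corrective-vector approach removes it.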
By
Madrigal, Francisco; Rivera, Mariano; Hayet, JeanBernard
This paper presents a particle filter-based approach for multiple target tracking in video streams in single static camera settings. We aim in particular to manage mid-dense crowd situations where, although tracking is possible, it is made complicated by the presence of frequent occlusions among targets and with scene clutter. Moreover, the appearance of targets is sometimes very similar, which makes standard trackers often switch target identities. Our contribution is twofold: (1) we first propose an estimation scheme for motion priors in the camera field of view that integrates sparse optical flow data and regularizes the corresponding discrete distribution fields over velocity directions and magnitudes; (2) we use these motion priors in a hybrid motion model for a particle filter tracker. Through several results on video surveillance datasets, we show the pertinence of this approach.
By
Díaz-Pacheco, Angel; Gonzalez-Bernal, Jesús A.; Reyes-García, Carlos Alberto; Escalante-Balderas, Hugo Jair
The increasingly large quantities of information generated in the world over the last few years have led to the emergence of the paradigm known as Big Data. The analysis of those vast quantities of data has become an important task in science and business in order to turn that information into a valuable asset. Many data analysis tasks involve the use of machine learning techniques during the model creation step, and the goal of these predictive models is to achieve the highest possible accuracy in predicting new samples; for this reason there is high interest in selecting the most suitable algorithm for a specific dataset. This trend is known as model selection, and it has been widely studied on datasets of common size but poorly explored in the Big Data context. As an effort to explore in this direction, this work proposes an algorithm for model selection in Big Data.
By
Dash, Sandeep Kumar; Pakray, Partha; Gelbukh, Alexander
We describe a framework for virtualizing documented physiotherapy instructions. This paper tries to bridge the gap between human understanding and written manuals of physiotherapy instructions through a pipeline of language processing techniques. As mapping text to action needs accurate synchronization between the sequence of commands and the generation of action, a structure has been developed that reflects the modeling techniques followed by some of the important action rendering systems. The idea is to put the semantic information into the proposed structure and add the implicit knowledge related to the domain. This eases the process of manually mapping wearable sensor data of human body movements into a rather simple analysis of textual instructions. The Natural Language Processing pipeline will involve, among others, semantic and spatial information processing, as these carry vital importance in this approach.
By
Ramos-Arreguin, Juan-Manuel; Guillen-Garcia, Emmanuel; Canchola-Magdaleno, Sandra; Pedraza-Ortega, Jesus-Carlos; Gorrostieta-Hurtado, Efren; Aceves-Fernández, Marco-Antonio; Ramos-Arreguin, Carlos-Alberto
3 Citations
Pneumatic actuators can be a useful way to control the position of a manipulator robot instead of electrical actuators. The major problem with pneumatic actuators is the compressibility of the air: because the mathematical model is a system formed by a set of highly nonlinear equations, a simple PID control is not enough to control the robot position, and fuzzy logic is a good option. This work is focused on the hardware implementation of a fuzzy logic algorithm in an FPGA system. This paper also presents a methodology to implement a pneumatic control using fuzzy logic in an FPGA device, which is the main contribution of this work. The air flow is controlled with pulse width modulation, applied to the pneumatic electro-valve, with a period of 25 ms.
By
Razo Gil, Laura Jeanine; Godoy-Calderón, Salvador; Barrón Fernández, Ricardo
This paper presents the theoretical mechanisms of techniques for non-convex shape recognition through the use of contour chains and a differentiated weighting scheme on them. As an application example, we use a set of digital images that represent the various symbols contained in the dactylological alphabet. First we introduce the reader to the many preprocessing and segmentation techniques applied to the set of images. Later on, we describe the use of direction codes to code the symbols' contours. Finally, a novel differentiated weighting scheme is incorporated into an ALVOT-type algorithm, which is then used for the supervised classification (identification) of the various symbols within the image set. The proposed methodology is then evaluated and contrasted through a series of experiments.
By
Kuri-Morales, Angel; Cartas-Ayala, Alejandro
One of the most interesting goals in engineering and the sciences is the mathematical representation of physical, social and other kinds of complex phenomena. This goal has been attempted and, lately, achieved with different machine learning (ML) tools. ML owes much of its present appeal to the fact that it allows modeling complex phenomena without the explicit definition of the form of the model. Neural networks and support vector machines exemplify such methods. However, in most cases these methods yield “black box” models, i.e. input and output correspond to the phenomena under scrutiny, but it is very difficult (or outright impossible) to discern the interrelation of the input variables involved. In this paper we address this problem with the explicit aim of targeting models which are closed in nature, i.e. where the aforementioned relation between variables is explicit. In order to do this, in general, the only assumption regarding the data is that they be approximately continuous. In such cases it is possible to represent the system with polynomial expressions. To be able to do so, one must define the number of monomials, the degree of every variable in every monomial, and the associated coefficients. We model sparse data systems with an algorithm minimizing the min-max norm. From mathematical and experimental evidence we are able to set a bound on the number of terms and degrees of the approximating polynomials. Thereafter, a genetic algorithm (GA) identifies the coefficients which correspond to the terms and degrees defined as above.
By
Galván-López, Edgar; Vázquez-Mendoza, Lucia; Trujillo, Leonardo
1 Citation
Data sets with imbalanced class distribution pose serious challenges to well-established classifiers. In this work, we propose a stochastic multiobjective genetic programming approach based on semantics. We tested this approach on imbalanced binary classification data sets, where it is able to achieve, in some cases, higher recall, precision and F-measure values on the minority class compared to C4.5, Naive Bayes and Support Vector Machines, without significantly decreasing these values on the majority class.
By
López Chau, Asdrúbal; Li, Xiaoou; Yu, Wen; Cervantes, Jair; Mejía-Álvarez, Pedro
2 Citations
Border points are those instances located at the outer margin of dense clusters of samples. Their detection is important in many areas such as data mining, image processing, robotics, geographic information systems and pattern recognition. In this paper we propose a novel method to detect border samples. The proposed method makes use of a discretization and works on partitions of the set of points. The border samples are then detected by applying an algorithm, similar to the one presented in reference [8], on the sides of convex hulls. We apply the novel algorithm to the classification task of data mining; experimental results show the effectiveness of our method.
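To illustrate the idea, here is a minimal sketch of border detection via convex hulls over grid partitions. The fixed-size grid partition and the hull routine (Andrew's monotone chain) are assumptions for illustration; the paper's discretization and the algorithm of reference [8] are not reproduced here.

```python
# Sketch: border samples as the union of convex-hull vertices of each grid cell.

def convex_hull(points):
    """Andrew's monotone chain; returns hull vertices (collinear points dropped)."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def border_samples(points, cell=1.0):
    """Partition points into square cells, keep each cell's hull vertices."""
    cells = {}
    for x, y in points:
        cells.setdefault((int(x // cell), int(y // cell)), []).append((x, y))
    border = set()
    for bucket in cells.values():
        border.update(convex_hull(bucket))
    return border
```

On a 3x3 grid of points falling into a single cell, the four corners are reported as border samples while the center point is not.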
By
Hernandez-Belmonte, Uriel H.; Ayala-Ramirez, Victor; Sanchez-Yanez, Raul E.
2 Citations
In this paper, we present a comparative review of Connected Component Labeling (CCL) methods, focused on two-pass variants, including their elements and implementation issues. We analyze the main elements used by these CCL algorithms and their importance for the performance of the methods using them. We present some experiments using a complex image set, evaluating the performance of each algorithm under analysis.
By
Segovia Domínguez, Ignacio; Hernández Aguirre, Arturo; Villa Diharce, Enrique
This paper introduces the Gaussian polytree estimation of distribution algorithm, a new construction method, and its application to estimation of distribution algorithms in continuous variables. The variables are assumed to be Gaussian. The construction of the tree and the edge orientation algorithm are based on information-theoretic concepts such as mutual information and conditional mutual information. The proposed Gaussian polytree estimation of distribution algorithm is applied to a set of benchmark functions. The experimental results show that the approach is robust; comparisons are provided.
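A minimal sketch of the information-theoretic backbone follows: pairwise Gaussian mutual information, I(X;Y) = -0.5·ln(1 - rho²), feeding a maximum-weight spanning tree (the Chow-Liu skeleton on which polytrees are built). The paper's edge-orientation step via conditional mutual information is omitted, and function names are illustrative.

```python
import math
from statistics import mean

def gaussian_mi(xs, ys):
    """Mutual information of two Gaussian variables: I = -0.5*ln(1 - rho^2)."""
    mx, my = mean(xs), mean(ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    rho = sxy / math.sqrt(sxx * syy)
    rho = max(min(rho, 0.999999), -0.999999)  # guard against log(0)
    return -0.5 * math.log(1.0 - rho * rho)

def chow_liu_skeleton(data):
    """Maximum-weight spanning tree over variables, weighted by pairwise MI
    (Kruskal with union-find). `data` maps variable name -> list of samples."""
    names = list(data)
    edges = sorted(((gaussian_mi(data[a], data[b]), a, b)
                    for i, a in enumerate(names) for b in names[i + 1:]),
                   reverse=True)
    parent = {n: n for n in names}
    def find(n):
        while parent[n] != n:
            parent[n] = parent[parent[n]]
            n = parent[n]
        return n
    tree = []
    for w, a, b in edges:
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb
            tree.append((a, b, w))
    return tree
```

With a strongly correlated pair and a weakly related third variable, the skeleton keeps the high-MI edge and attaches the third variable by its best remaining link.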
By
Díaz-Pacheco, Angel; Reyes-García, Carlos Alberto
Full Model Selection is a technique for improving the accuracy of machine learning algorithms through the search, for each dataset, of the most adequate combination of feature selection, data preparation, a machine learning algorithm and its hyperparameter tuning. With the increasingly large quantities of information generated in the world, the emergence of the paradigm known as Big Data has made possible the analysis of gigantic datasets in order to obtain useful information for science and business. Though Full Model Selection is a powerful tool, it has been poorly explored in the Big Data context, due to the vast search space and the elevated number of fitness evaluations of candidate models. In order to overcome this obstacle, we propose the use of proxy models to reduce the number of expensive fitness-function evaluations, and also the use of the Full Model Selection paradigm in the construction of such proxy models.
By
Ezin, Eugène C.; Reyes-Galaviz, Orion Fausto; Reyes-García, Carlos A.
Character classification is one of many basic applications in the field of artificial neural networks (ANNs), while low-volume data transmission is important in the field of source coding. In this paper, we constructed an alphabet of 36 letters which are encoded with the Huffman algorithm and then classified with a backpropagation feed-forward artificial neural network. Since an ANN is initialized with random weights, its performance is not always optimal. Therefore, we designed a simple genetic algorithm (SGA) that chooses an ANN and optimizes its architecture to improve the recognition accuracy. A performance evaluation is given to show the effectiveness of the procedure used, with which we reached an accuracy of 100%.
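The encoding half of this pipeline can be sketched as a Huffman code built over a 36-symbol alphabet with an encode/decode round trip. This is a generic Huffman implementation, not the authors' exact setup: their symbol frequencies and the ANN classification stage are not reproduced.

```python
import heapq
from itertools import count

def huffman_codes(freqs):
    """Build a prefix-free Huffman code: symbol -> bit string."""
    tick = count()  # tie-breaker so the heap never compares the code dicts
    heap = [(f, next(tick), {s: ""}) for s, f in freqs.items()]
    heapq.heapify(heap)
    if len(heap) == 1:  # degenerate single-symbol alphabet
        return {s: "0" for s in freqs}
    while len(heap) > 1:
        fa, _, ca = heapq.heappop(heap)
        fb, _, cb = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in ca.items()}
        merged.update({s: "1" + c for s, c in cb.items()})
        heapq.heappush(heap, (fa + fb, next(tick), merged))
    return heap[0][2]

def encode(text, codes):
    return "".join(codes[ch] for ch in text)

def decode(bits, codes):
    rev, out, cur = {v: k for k, v in codes.items()}, [], ""
    for b in bits:
        cur += b
        if cur in rev:      # prefix-free codes decode greedily
            out.append(rev[cur])
            cur = ""
    return "".join(out)
```

A 36-symbol alphabet (letters plus digits) round-trips losslessly, and no code word is a prefix of another.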
By
Silva-López, Rafaela Blanca; Herrera-Alcántara, Oscar
In this work we present a Selection Model of Learning Activities (SMLA) in a Personalized Virtual Learning Environment. The SMLA evaluates and classifies academic activities based on their difficulty, so that too difficult or too easy activities are replaced with others of intermediate difficulty. The SMLA is part of a Strategic Learning Metamodel conformed by three layers: (i) the intelligent layer, which includes a personalized virtual learning environment; (ii) the infrastructure layer, based on Cloud Computing; and (iii) the regulation model, which controls the activities assigned to the learners based on the Full-Brain theory. We developed four experiments implementing the SMLA and observed that the model with activities of difficulty between 0.2 and 0.85 produces an enhancement in the academic performance of the students.
By
Lobato-Ríos, Víctor; Tenorio-Gonzalez, Ana C.; Morales, Eduardo F.
Object recognition is a relevant task for many areas and, in particular, for service robots. Recently, object recognition has been dominated by the use of Deep Neural Networks (DNNs); however, they require a large number of images and long training times. If a user asks a service robot to search for an unknown object, it has to deal with selecting relevant images to learn a model, deal with polysemy, and learn a model relatively quickly to be of any use to the user. In this paper we describe an object recognition system that deals with the above challenges by: (i) a user interface to reduce different object interpretations, (ii) downloading on-the-fly images from the Internet to train a model, and (iii) using the outputs of a trimmed pre-trained DNN as attributes for an SVM. The whole process (selecting and downloading images and training a model) of learning a model for an unknown object takes around two minutes. The proposed method was tested on 72 common objects found in a house environment with very high precision and recall rates (over 90%).
By
Chang, Leonardo; Duarte, Miriam Monica; Sucar, Luis Enrique; Morales, Eduardo F.
Show all (4)
1 Citation
Several methods have been presented in the literature that successfully used SIFT features for object identification, as they are reasonably invariant to translation, rotation, scale, illumination and partial occlusion. However, they have poor performance for classification tasks. In this work, SIFT features are used to solve problems of object class recognition in images using a two-step process. In its first step, the proposed method performs clustering on the extracted features in order to characterize the appearance of classes. Then, in the classification step, it uses a three layer Bayesian network for object class recognition. Experiments show quantitatively that clusters of SIFT features are suitable to represent classes of objects. The main contributions of this paper are the introduction of a Bayesian network approach in the classification step to improve performance in an object class recognition task, and a detailed experimentation that shows robustness to changes in illumination, scale, rotation and partial occlusion.
By
Guillén Galván, Carlos; Valdés Amaro, Daniel; Uriarte Adrián, Jesus
In this paper we revisit the Mumford-Shah functional, one of the most studied variational approaches to image segmentation. The contribution of this work is to propose a modification of the Mumford-Shah functional that includes Fractal Analysis to improve the segmentation of images with fractal or semi-fractal objects. Here we show how the fractal dimension is calculated and embedded in the functional minimization computation to drive the algorithm to use both changes in the image intensities and the fractal characteristics of the objects to obtain a more suitable segmentation. Experimental results confirm that the proposed modification improves the quality of the segmentation in images with fractal or semi-fractal objects, such as medical images.
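The fractal-dimension ingredient can be illustrated with a standard box-counting estimate. The grid scales below and the restriction to a 2-D point set are assumptions for illustration; how the estimate is embedded into the functional minimization is not reproduced here.

```python
import math

def box_counting_dimension(points, scales=(1, 2, 4, 8, 16)):
    """Estimate the fractal dimension of a 2-D point set in [0,1)^2 by
    box counting: fit the slope of log(count) vs log(1/box_size)."""
    xs, ys = [], []
    for n in scales:                      # n boxes per axis, box size 1/n
        boxes = {(int(x * n), int(y * n)) for x, y in points}
        xs.append(math.log(n))
        ys.append(math.log(len(boxes)))
    # least-squares slope of the log-log relation
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    den = sum((a - mx) ** 2 for a in xs)
    return num / den
```

Sampling a straight line yields a dimension close to 1, as expected for a smooth (non-fractal) curve.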
By
Romero-Rodríguez, Wendoly J. Gpe.; Zamudio Rodríguez, Victor Manuel; Baltazar Flores, Rosario; Sotelo-Figueroa, Marco Aurelio; Alcaraz, Jorge Alberto Soria
One of the concerns of humanity today is developing strategies for saving energy, because we need to reduce energy costs and promote economical, political and environmental sustainability. As we have mentioned before, in recent times one of the main priorities is energy management. The goal of this project is to develop a system able to find optimal configurations for energy savings through light management. In this paper a comparison between Genetic Algorithms (GA) and Bee Swarm Optimization (BSO) is made. These two strategies are focused on light management as the main scenario, taking into account the activity of the users, the size of the area, and the quantity and power of the lights. It was found that the GA provides an optimal configuration (according to the user's needs), and this result was consistent with Wilcoxon's test.
By
Ortuño, Santiago Yip; Hernández Aguilar, José Alberto; Taboada, Blanca; Ochoa Ortiz, Carlos Alberto; Ramírez, Miguel Pérez; Arroyo Figueroa, Gustavo
We discuss the application of Artificial Intelligence to the design of intrusion detection systems (IDS) for computer networks. For this purpose, we use the J48 algorithm and the ClonalG [5] artificial immune system algorithm, in the WEKA software, to classify and predict intrusions in the KDD Cup 1999 and Kyoto 2006 databases. For the KDD Cup 1999 database we obtain 92.69% precision for ClonalG and 99.91% for J48, respectively. For the Kyoto University 2006 database, we obtain 95.2% precision for ClonalG and 99.25% for J48. Finally, based on these results, we propose a model to detect intrusions using AI techniques. The main contribution of the paper is the adaptability of the ClonalG algorithm and the reduction of database attributes by using Genetic Search.
By
Mora-Gutiérrez, Roman Anselmo; Ponsich, Antonin; Rincón García, Eric Alfredo; de los Cobos Silva, Sergio Gerardo; Gutiérrez Andrade, Miguel Ángel; Lara-Velázquez, Pedro
The constrained portfolio optimization problem with multiple objective functions cannot be efficiently solved using exact techniques. Thus, heuristic approaches seem to be the best option to find high-quality solutions in a limited amount of time. For solving this problem, this paper proposes an algorithm based on the Method of Musical Composition (MMC), a metaheuristic that mimics a multi-agent creativity system associated with musical composition. In order to prove its performance, the algorithm was tested over five well-known benchmark data sets, and the obtained results prove to be highly competitive, since they outperform those reported in the specialized literature in four out of the five tackled instances.
By
Ibargüengoytia, Pablo H.; Delgadillo, Miguel A.; García, Uriel A.
Several learning algorithms have been proposed to construct probabilistic models from data using the Bayesian network mechanism. Some of them permit the participation of human experts in order to create a knowledge representation of the domain. However, multiple different models may result for the same problem using the same data set. This paper presents the experiences in the construction of a probabilistic model that conforms a viscosity virtual sensor. Several experiments have been conducted and several different models have been obtained. This paper describes the evaluation of all models under different criteria. The analysis of the models and the conclusions identified are included in this paper.
By
Gomez, Juan Carlos; Terashima-Marín, Hugo
5 Citations
This article presents a method based on the multiobjective evolutionary algorithm NSGA-II to approximate hyperheuristics for solving irregular 2D cutting stock problems under multiple objectives. In this case, additionally to the traditional objective of minimizing the number of sheets used to fit a finite number of irregular pieces, the time required to perform the placement task is also minimized, leading to a bi-objective minimization problem with a trade-off between the number of sheets and the time required for placing all pieces. We solve this problem using multiobjective hyperheuristics (MOHHs), whose main idea consists of finding a set of simple heuristics which can be combined to find a general solution for a wide range of problems, where a single heuristic is applied depending on the current condition of the problem, instead of applying a unique single heuristic during the whole placement process. The MOHHs are approximated after going through a learning process by means of the NSGA-II, which evolves combinations of condition-action rules, producing at the end a set of Pareto-optimal MOHHs. We tested the approximated MOHHs on several sets of benchmark problems, with outstanding results for most of the cases.
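The NSGA-II machinery underlying such multiobjective searches rests on Pareto dominance. A minimal sketch of non-dominated sorting follows; this is the naive quadratic-per-front variant, not Deb's fast bookkeeping, and is shown only to make the front structure concrete.

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_fronts(population):
    """Split objective vectors into successive Pareto fronts."""
    fronts, remaining = [], list(population)
    while remaining:
        front = [p for p in remaining
                 if not any(dominates(q, p) for q in remaining if q != p)]
        fronts.append(front)
        remaining = [p for p in remaining if p not in front]
    return fronts
```

For instance, with objectives (sheets, time), the first front holds the trade-off solutions no other solution improves on in both objectives at once.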
By
Flores, Juan J.; López, Rodrigo; Barrera, Julio
2 Citations
Evolutionary computation is inspired by nature in order to formulate metaheuristics capable of optimizing several kinds of problems. A family of algorithms has emerged based on this idea, e.g. genetic algorithms, evolutionary strategies, particle swarm optimization (PSO), ant colony optimization (ACO), etc. In this paper we show a population-based metaheuristic inspired by the gravitational forces produced by the interaction of the masses of a set of bodies. We explored physics knowledge in order to find useful analogies to design an optimization metaheuristic. The proposed algorithm is capable of finding the optima of unimodal and multimodal functions commonly used to benchmark evolutionary algorithms. We show that the proposed algorithm works and outperforms PSO with niches in both cases. Our algorithm does not depend on a radius parameter and does not need to use niches to solve multimodal problems. We also compare with other metaheuristics with respect to the mean number of evaluations needed to find the optima.
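A toy version of such a gravity-inspired metaheuristic can be sketched as follows. This is an illustrative scheme only (bodies accelerated toward the best body found so far, with a decaying gravitational constant), not the authors' algorithm, and the sphere function stands in for their benchmark suite.

```python
import random

def sphere(x):
    """Benchmark objective: global minimum 0 at the origin."""
    return sum(v * v for v in x)

def gravity_optimize(f, dim=2, n=20, iters=300, seed=7):
    """Toy gravity-inspired search: every body is pulled toward the best
    position found so far, with velocity damping acting as inertia."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5.0, 5.0) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    best = list(min(pos, key=f))          # copy: avoid aliasing a moving body
    best_val = f(best)
    g = 1.0                               # gravitational constant
    for _ in range(iters):
        for i in range(n):
            fi = f(pos[i])
            if fi < best_val:
                best, best_val = list(pos[i]), fi
            for d in range(dim):
                pull = g * rng.random() * (best[d] - pos[i][d])
                vel[i][d] = 0.5 * vel[i][d] + pull
                pos[i][d] += vel[i][d]
        g *= 0.99                         # slowly weaken gravity
    return best, best_val
```

On the 2-D sphere function the population collapses onto steadily better attractors, approaching the origin.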
By
Marin-Castro, Heidy M.; Sosa-Sosa, Victor J.; Lopez-Arevalo, Ivan
The amount of information contained in databases on the Web has grown explosively in the last years. This information, known as the Deep Web, is dynamically obtained from specific queries to these databases through Web Query Interfaces (WQIs). The problem of finding and accessing databases on the Web is a great challenge because Web sites are very dynamic and the existing information is heterogeneous. Therefore, it is necessary to create efficient mechanisms to access, extract and integrate the information contained in Web databases. Since WQIs are the only means to access these databases, the automatic identification of WQIs plays an important role, helping traditional search engines to increase their coverage and to access interesting information not available on the indexable Web. In this paper we present a strategy for the automatic identification of WQIs using supervised learning, making an adequate selection and extraction of HTML elements in the WQIs to form the training set. We present two experimental tests over a corpus of HTML forms considering positive and negative examples. Our proposed strategy achieves better accuracy than previous works reported in the literature.
By
Vivar-Estudillo, Guillermina; Ibarra-Manzano, Mario-Alberto; Almanza-Ojeda, Dora-Luz
Tremor is an involuntary rhythmic movement observed in people with Parkinson's disease (PD); in particular, hand tremor is a measurement used for diagnosing this disease. In this paper, we use hand positions acquired by a Leap Motion device for statistical analysis of hand tremor based on the sum and difference of histograms (SDH). Tremor is measured using only one coordinate of the center of the palm during predefined exercises performed by volunteers at a hospital. In addition, the statistical features obtained with SDH are used to classify the tremor signal as PD or non-PD. Experimental results show that the classification is independent of the hand used during the tests, achieving 98% accuracy for our proposed approach using different supervised machine learning classifiers. Additionally, we compare our results with other classifiers proposed in the literature.
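The SDH statistics in question can be sketched for a quantized 1-D signal as follows. The lag value, the palm-coordinate preprocessing, and the classifier stage are omitted; the mean and variance estimators follow Unser's classic sum-and-difference-histogram formulation.

```python
from collections import Counter

def sum_diff_histograms(signal, lag=1):
    """Sum and difference histograms (SDH) of a quantized 1-D signal."""
    pairs = range(len(signal) - lag)
    sums = Counter(signal[i] + signal[i + lag] for i in pairs)
    diffs = Counter(signal[i] - signal[i + lag] for i in pairs)
    return sums, diffs

def sdh_features(sums, diffs):
    """Mean and variance estimated from the normalized histograms:
    mu = 0.5 * E[s],  var = 0.25 * (E[(s - 2*mu)^2] + E[d^2])."""
    n = sum(sums.values())
    ps = {k: v / n for k, v in sums.items()}
    pd_ = {k: v / n for k, v in diffs.items()}
    mu = 0.5 * sum(k * p for k, p in ps.items())
    var_s = sum((k - 2 * mu) ** 2 * p for k, p in ps.items())
    var_d = sum(k * k * p for k, p in pd_.items())
    return {"mean": mu, "variance": 0.25 * (var_s + var_d)}
```

A constant signal (no tremor at all) yields zero variance, while an oscillating one spreads mass into the difference histogram.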
By
Garza-Cuéllar, Alejandro; Valenzuela-Rendón, Manuel; Parra-Álvarez, Ricardo-Javier
Wilson's XCS represents and stores the knowledge it has acquired from an environment as a set of classifiers. In the XCS, don't cares (#) may be used in the conditions of classifiers to express generalization. This paper is focused on the representation of knowledge with the minimal number of classifiers. For this purpose, a new process called fusion is implemented. Fusion promotes the emergence of more generalized yet accurate classifiers and the reduction of the number of macroclassifiers. Furthermore, to get even more compact rule sets, the implementation of the # symbol in the action of the classifiers is proposed; this allows generalization when possible, and the existence of non-competing classifiers in the population if a state has multiple equally correct actions that can be performed. The proposed modified generalized extended XCS (gXCS) was compared with the XCS on the Woods2 environment and a modification of this environment, modified Woods2, that has locations where there are multiple equally good actions. The performances of XCS and gXCS are very similar; yet, gXCS obtains more parsimonious rule sets. Furthermore, gXCS can find good rule sets even when the probability of # is set to zero, contrary to the XCS.
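The ternary matching at the heart of XCS conditions can be sketched as follows. This minimal fragment models only condition matching with don't cares; the gXCS extension of '#' to the action part, and all of the fusion machinery, are not reproduced.

```python
def matches(condition, state):
    """Ternary XCS-style matching: '#' is a don't-care position."""
    return len(condition) == len(state) and all(
        c == "#" or c == s for c, s in zip(condition, state))

def match_set(classifiers, state):
    """Classifiers whose condition matches `state`;
    each classifier is a (condition, action) pair."""
    return [(cond, act) for cond, act in classifiers if matches(cond, state)]
```

A single generalized condition such as "1#0" thus stands in for the two specific conditions "100" and "110", which is exactly how don't cares shrink the rule set.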
By
Martínez, Luis G.; Rodríguez-Díaz, Antonio; Licea, Guillermo; Castro, Juan R.
4 Citations
This paper proposes an ANFIS (Adaptive Network-Based Fuzzy Inference System) learning approach in which we have found patterns of personality types using Big Five personality tests for software engineering roles in software development project teams, as part of the RAMSET (Role Assignment Methodology for Software Engineering Teams) methodology. An ANFIS model is applied to a set of role traits resulting from Big Five personality tests in our case studies, obtaining a Takagi-Sugeno-Kang (TSK) Fuzzy Inference System (FIS) model with rules that help us recommend the roles best suited for performing in software engineering teams.
By
Kuri-Morales, Angel Fernando; López-Peña, Ignacio
1 Citation
The latest AI techniques are usually computer-intensive, as opposed to the traditional ones, which rely on the consistency of the logic principles on which they are based. In contrast, many algorithms of Computational Intelligence (CI) are metaheuristic, i.e. methods where the particular selection of parameters defines the details and characteristics of the heuristic proper. In this paper we discuss a method which allows us to ascertain, with high statistical significance, the relative performance of several metaheuristics. To achieve our goal we must find a statistical goodness-of-fit (gof) test which allows us to determine the moment when the sample becomes normal. Most statistical gof tests are designed to reject the null hypothesis (i.e. that the samples do NOT fit the same distribution). In this case we wish to determine the case where the sample IS normal. Using a Monte Carlo simulation we are able to find a practical gof test to this effect. We discuss the methodology and describe its application to the analysis of three case studies: training of neural networks, genetic algorithms and unsupervised clustering.
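The Monte Carlo side of such a methodology can be illustrated with a simple acceptance band: simulate a chosen statistic under normality and check whether the observed value falls inside the central range. Sample skewness is used here purely as an assumed stand-in; the paper's actual gof statistic and procedure differ.

```python
import random
from statistics import mean, pstdev

def skewness(xs):
    """Standardized third moment of a sample."""
    m, s = mean(xs), pstdev(xs)
    return sum(((x - m) / s) ** 3 for x in xs) / len(xs)

def normality_band(n, trials=2000, alpha=0.05, seed=3):
    """Monte Carlo acceptance band: central (1 - alpha) range of the
    statistic over `trials` simulated normal samples of size n."""
    rng = random.Random(seed)
    stats = sorted(skewness([rng.gauss(0.0, 1.0) for _ in range(n)])
                   for _ in range(trials))
    return stats[int(trials * alpha / 2)], stats[int(trials * (1 - alpha / 2))]

def looks_normal(sample, trials=2000, alpha=0.05):
    lo, hi = normality_band(len(sample), trials, alpha)
    return lo <= skewness(sample) <= hi
```

A heavily skewed sample (e.g. exponential draws) falls well outside the simulated band, while the band itself straddles zero as expected under normality.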
By
Markov, Ilia; Gómez-Adorno, Helena; Posadas-Durán, Juan-Pablo; Sidorov, Grigori; Gelbukh, Alexander
To determine the author demographics of texts in social media such as Twitter, blogs, and reviews, we use doc2vec document embeddings to train a logistic regression classifier. We experimented with age and gender identification on the PAN author profiling 2014–2016 corpora under both single- and cross-genre conditions. We show that under certain settings the neural network-based features outperform the traditional features when using the same classifier. Our method outperforms the existing state of the art under some settings, though the current state-of-the-art results on those tasks have been quite weak.
By
Ponce, Hiram
1 Citation
Reinforcement learning aims to solve the problem of learning optimal or near-optimal decision-making policies for a given domain problem. However, it is known that increasing the dimensionality of the input space (i.e. the environment) increases the complexity for the learning algorithms, falling into the curse of dimensionality. Value function approximation and hierarchical reinforcement learning are two different approaches proposed to alleviate this problem. In that sense, this paper proposes a new value function approximation using artificial hydrocarbon networks (a supervised learning method inspired by chemical carbon networks) with regularization at each subtask in a hierarchical reinforcement learning framework. Comparative results against a greedy sparse value function approximation over the MAXQ hierarchical method were computed, proving that artificial hydrocarbon networks improve accuracy and efficiency in the value function approximation.
By
Valdez, Fevrier; Melin, Patricia; Castillo, Oscar
This paper describes a hybrid approach for optimization combining Particle Swarm Optimization (PSO) and Genetic Algorithms (GAs), using Fuzzy Logic to integrate the results; the proposed method is called FPSO+FGA. The new hybrid FPSO+FGA approach is compared with the Simulated Annealing (SA), PSO, GA and Pattern Search (PS) methods on a set of benchmark mathematical functions.
By
Garduno-Ramirez, Raul; Borunda, Mónica
Lately, great efforts have been made to develop effective hybrid power systems, which consist of a mixture of renewable and conventional power plants, energy storage systems and power consumers. The very dissimilar characteristics of these elements, as well as the ever-increasing performance requirements imposed on them, make the design of control systems for power generation plants a remarkably challenging task. A promising approach to provide effective solutions to this problem is to apply the paradigms of Intelligent Agents and Multi-Agent Systems. In this paper, the definition of an Intelligent Multi-Agent System for Supervision and Control (iMASSC) is proposed to create intelligent power plants for either renewable or conventional power generation units. A Multi-Agent System with a generic structure is used instead of a single specific Intelligent Agent. This approach is more realistic in that it takes into account the complexity of current power plants. Later, the community of intelligent power plants, through autonomous and coherent collaboration, will achieve the objectives of the hybrid power system. Hence, the iMASSC model is expected to provide feasible solutions to the operation of modern intelligent hybrid power systems and smart grids.
By
MartinezSoto, Ricardo; Castillo, Oscar; Aguilar, Luis T.; Melin, Patricia
10 Citations
In this paper we apply bio-inspired and evolutionary optimization methods to design fuzzy logic controllers (FLCs) that minimize the steady-state error of linear systems. We test the optimal FLCs obtained by the genetic algorithms and PSO on linear systems using benchmark plants. The bio-inspired and evolutionary methods are used to find the parameters of the membership functions of the FLC in order to obtain the optimal controller. Simulation results are obtained with Simulink, showing the feasibility of the proposed approach.
By
Neme, Antonio; Lugo, Blanca; Cervera, Alejandra
1 Citation
Writers tend to express their ideas with different styles, defined by the so-called firm or stylome, which is an abstraction of the general constraints and the specific combinations of words within their language that they decide to follow. Although capturing this style has proven to be very difficult, some advances have been achieved. Here, we present a novel system that is trained with texts from the same author and is able to unveil some of its features, and to apply them to detect texts not written by the same author or, at least, not written with the previously learned features. The system is a hybrid model based on self-organizing maps and on information-theoretic aspects. In the model, the mutual information function of unknown texts is compared to the mutual information function of texts from a known author. If the distance between these two distributions exceeds a certain threshold, then the unknown text is from a different author; otherwise the authorship is the same. The decision threshold is obtained by the self-organizing map trained with the texts from the same author. We present results in authorship identification in several contexts including classic literature, journalism (political, economical, sports), and scientific divulgation.
By
Campa, Carlos; Acevedo, Antonio; Acevedo, Elena
In this work we present a new proposal to initialize the weights in a Backpropagation Neural Network (NN), using the coefficients of a FIR low-pass filter to introduce a null in the radiation pattern of a seven-element antenna array to eliminate interferences in a radar system. A radar system needs to eliminate directional noise in order to obtain a cleaner signal. The method used to eliminate this kind of noise (jitter) has to be adaptive because the target is in constant movement; therefore, the adaptation time must be as short as possible. Our work is based on the window method to reduce the secondary lobes in fixed antenna arrays. We modify the radiation pattern by introducing a null at 45.5°, which corresponds to the secondary lobe where the interference is present. This is achieved by creating windows from several FIR low-pass filters. The coefficients of these filters are used to initialize the weight vectors of a Backpropagation Neural Network, which performs the adaptive process to obtain the final parameters that achieve the noise elimination. To test our proposal we calculate the Mean Square Error (MSE) and the Signal-to-Noise Ratio (SNR), and we graph the radiation pattern. In addition, we calculate, at each iteration, the cross-correlation index between the desired signal and our results. With this method we reduced the number of iterations required by the process.
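The filter-design step can be illustrated with a standard windowed-sinc FIR low-pass. The Hamming window, tap count, and cutoff below are placeholders for illustration, not the values used to place the 45.5° null; how the coefficients seed the network weights is not reproduced.

```python
import math

def fir_lowpass(num_taps, cutoff):
    """Windowed-sinc FIR low-pass design (Hamming window).
    `cutoff` is the normalized cutoff frequency in cycles/sample (0..0.5)."""
    m = num_taps - 1
    h = []
    for k in range(num_taps):
        x = k - m / 2
        # ideal low-pass impulse response (sinc), with the center tap special-cased
        ideal = 2 * cutoff if x == 0 else math.sin(2 * math.pi * cutoff * x) / (math.pi * x)
        window = 0.54 - 0.46 * math.cos(2 * math.pi * k / m)  # Hamming
        h.append(ideal * window)
    s = sum(h)                    # normalize for unity DC gain
    return [c / s for c in h]
```

The resulting taps are symmetric (linear phase), peak at the center, and sum to one; such a vector could then serve as one weight-initialization row.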
By
GonzalezHernandez, Loreto; TorresJimenez, Jose
Software systems are increasingly used by our society, so a failure in them can lead to large losses. To reduce software failures it is necessary to carry out the testing process appropriately. Combinatorial testing helps in the testing process by providing structures with a test set of small size, such as Mixed Covering Arrays (MCAs). However, constructing an optimal test set is an NP-complete problem, which has led to the development of non-exhaustive approaches to solve it. This paper proposes a new Tabu Search (TS) approach called MiTS (which stands for Mixed Tabu Search) that focuses on constructing MCAs. The approach is based on the use of a mixture of neighborhood functions and a fine-tuning process to improve the performance of the TS. Experimental evidence shows poor performance when a single neighborhood function is used. On the other hand, the TS using a mixture of neighborhood functions is competitive in the construction of MCAs over a known benchmark reported in the literature.
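The idea of a tabu search driven by a mixture of neighborhood functions can be sketched on a toy problem. This is an illustrative assumption, not the MiTS algorithm itself: the one-max objective, the two neighborhoods, the mixture weights, and the tabu-list length all stand in for the paper's MCA-specific choices and fine-tuning.

```python
import random

def tabu_search_mixed(n_bits=20, iters=200, k=10, probs=(0.7, 0.3), seed=1):
    """Tabu search that draws each candidate move from a mixture of
    neighborhood functions, demonstrated on a toy one-max problem."""
    rng = random.Random(seed)

    def flip_one(s):                      # neighborhood 1: flip one bit
        i = rng.randrange(n_bits)
        t = s[:]; t[i] = 1 - t[i]
        return t, ('f1', i)

    def flip_two(s):                      # neighborhood 2: flip two bits
        i, j = rng.sample(range(n_bits), 2)
        t = s[:]; t[i] = 1 - t[i]; t[j] = 1 - t[j]
        return t, ('f2', min(i, j), max(i, j))

    hoods = [flip_one, flip_two]
    sol = [rng.randint(0, 1) for _ in range(n_bits)]
    best, tabu = sol[:], []
    for _ in range(iters):
        # k candidates, each from a neighborhood chosen by the mixture
        cands = [rng.choices(hoods, weights=probs)[0](sol) for _ in range(k)]
        cands = [(c, m) for c, m in cands
                 if m not in tabu or sum(c) > sum(best)]   # aspiration
        if not cands:
            continue
        sol, move = max(cands, key=lambda cm: sum(cm[0]))  # best neighbor
        tabu = (tabu + [move])[-7:]                        # short-term memory
        if sum(sol) > sum(best):
            best = sol[:]
    return best

best = tabu_search_mixed()
```

Tuning the mixture weights in `probs` plays the role of the fine-tuning process the abstract describes.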
By
KuriMorales, Angel; TrejoBaños, Daniel; CortesBerrueco, Luis Enrique
The problem of finding clusters in arbitrary sets of data has been attempted using different approaches. In most cases, the use of metrics to determine the adequateness of said clusters is assumed; that is, the criteria yielding a measure of cluster quality depend on the distance between the elements of each cluster. Typically, one considers a cluster to be adequately characterized if the elements within it are close to one another while, simultaneously, they appear to be far from those of different clusters. This intuitive approach fails if the variables of the elements of a cluster are not amenable to distance measurements, i.e., if the vectors of such elements cannot be quantified. This case arises frequently in real-world applications where several variables (if not most of them) correspond to categories. The usual tendency is to assign arbitrary numbers to every category, that is, to encode the categories. This, however, may result in spurious patterns: relationships between the variables which were not really there at the outset. It is evident that there is no truly valid assignment which may ensure a universally valid numerical value for this kind of variable. But there is a strategy which guarantees that the encoding will, in general, not bias the results. In this paper we explore such a strategy. We discuss the theoretical foundations of our approach and prove that this is the best strategy in terms of the statistical behavior of the sampled data. We also show that, when applied to a complex real-world problem, it allows us to generalize soft computing methods to find the number and characteristics of a set of clusters. We contrast the characteristics of the clusters obtained with the automated method against those given by the experts.
By
Garza Villarreal, Sara Elena; Brena, Ramón F.
1 Citation
This paper introduces an approach for discovering thematically related document groups (a topic mining task) in massive document collections with the aid of graph local clustering. This can be achieved by viewing a document collection as a directed graph where vertices represent documents and arcs represent connections among these (e.g. hyperlinks). Because a document is likely to have more connections to documents of the same theme, we have assumed that topics have the structure of a graph cluster, i.e. a group of vertices with more arcs to the inside of the group and fewer arcs to the outside of it. So, topics could be discovered by clustering the document graph; we use a local approach to cope with scalability. We also extract properties (keywords and most representative documents) from clusters to provide a summary of the topic. This approach was tested over the Wikipedia collection and we observed that the resulting clusters in fact correspond to topics, which shows that topic mining can be treated as a graph clustering problem.
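The "more arcs to the inside than to the outside" notion of a graph cluster can be made concrete with a small score. The adjacency-list representation and the particular ratio below are illustrative assumptions, not the authors' exact clustering objective.

```python
def cluster_locality(graph, cluster):
    """Fraction of arcs leaving cluster vertices that stay inside the
    cluster; values near 1 match the intuition of a topical group."""
    inside = outside = 0
    for v in cluster:
        for w in graph.get(v, []):
            if w in cluster:
                inside += 1
            else:
                outside += 1
    return inside / (inside + outside)

# Toy directed document graph: vertices are documents, arcs are links
graph = {'a': ['b', 'c'], 'b': ['a', 'c'], 'c': ['a', 'd'],
         'd': ['e'], 'e': ['d']}
ratio = cluster_locality(graph, {'a', 'b', 'c'})
```

Here documents a, b, c link mostly among themselves (5 internal arcs, 1 external), so the score is 5/6, suggesting a topic-like cluster.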
By
Cervantes, Leticia; Castillo, Oscar; Melin, Patricia
8 Citations
In this paper we present the simulation results obtained so far with a new approach for intelligent control of nonlinear dynamical plants. First we present the proposed approach for intelligent control using a hierarchical modular architecture with type-2 fuzzy logic for combining the outputs of the modules. Then the approach is illustrated with two cases, aircraft control and shower control, and for each problem we explain its behavior. Simulation results of the two cases show that the proposed approach has potential in solving complex control problems.
By
Reyes, Alberto; Ibargüengoytia, Pablo H.; Jijón, J. Diego; Guerrero, Tania; García, Uriel A.; Borunda, Mónica
Forecasting represents a very important task for planning, control and decision making in many fields. Forecasting the dollar price is important for global companies to plan their investments. Forecasting the weather is determinant in deciding whether to hold a party outdoors or indoors. Forecasting the behaviour of a process is the key factor in predictive control. In this paper, we present a methodology to build wind-power forecasting models from data using a combination of artificial intelligence techniques, such as artificial neural networks and dynamic Bayesian networks. These techniques yield forecast models with different characteristics. Finally, a model recalibration function is applied to the raw discrete models in order to gain extra accuracy. The experiments run for unit 1 of the Villonaco wind farm in Ecuador demonstrated that selecting the best predictor can be more useful than selecting a single high-efficiency approach.
By
Herrera Alcántara, Oscar; González Mendoza, Miguel
In this work we review the parameterization of the filter coefficients of compactly supported orthogonal wavelets used to implement the discrete wavelet transform. We also present the design of wavelet-based filters as a constrained optimization problem, where a genetic algorithm can be used to improve the compression ratio on grayscale images by minimizing their entropy, and we develop a quasi-perfect reconstruction scheme for images. Our experimental results report a significant improvement over previous works and motivate us to explore other kinds of perfect reconstruction filters based on parameterized tight frames.
By
Jiménez, Ricardo Benítez; Morales, Eduardo F.; Escalante, Hugo Jair
The multi-label classification task has many applications in text categorization, multimedia, biology, chemical data analysis and social network mining, among others. Different approaches have been developed: Binary Relevance (BR), Label Power Set (LPS), and Random k-label sets (RAkEL); some of them consider the interaction between labels in a chain (Chain Classifiers), and other alternatives are derived from this method, for instance the Probabilistic Chain Classifier, the Monte Carlo Chain Classifier and the Bayesian Chain Classifier (BCC). What all these approaches have in common, and focus on, is considering different orders or combinations in which the labels are to be predicted. Feature selection has proved to be important in classification tasks, reducing the dimensionality of the problem and even improving the classification model's accuracy. In this work a feature selection technique is tested in the BCC algorithm with two search methods, one using Best First (BFFSBCC) and another using Greedy Stepwise (GSFSBCC). These methods are compared through the Wilcoxon signed-rank test; the winner is also compared with plain BCC, with other chain classifiers, and finally with the other approaches (BR, RAkEL, LPS).
By
Calderon, Felix; Júnez–Ferreira, Carlos A.
Image denoising by minimizing a neighborhood-similarity-based cost function is presented. This cost function consists of two parts: one related to data fidelity, and a structure-preserving smoothing term. The latter is controlled by a weight coefficient that measures the neighborhood similarity between two pixels, and an additional term penalizes it. Unlike most work in the noise removal area, the weight of each pixel within the neighborhood is not defined by a Gaussian function. The obtained results show a good performance of our proposal compared with some state-of-the-art algorithms.
By
BarrónEstrada, María Lucía; ZatarainCabada, Ramón; LindorValdez, Mario
1 Citation
We have developed a novel authoring tool, named CodeTrainig, for a programming learning environment that incorporates gamification as a means of motivation. This tool focuses on improving students' programming skills, and it offers authors not only authorship of resources but also of the gamification associated with them. An author can create courses composed of several resources. Resources are formed, at their finest grain, by programming exercises that have a description of the problem to be solved, a set of test cases, and game elements. Students can participate in courses by solving their programming exercises. As they solve exercises they earn points and rise on a leaderboard. Moreover, the environment lets students enable or disable game components, since some of them might dislike the competitive nature of gamification. We present some experiments we have made with the authoring tool.
By
Hidalgo, Denisse; Melin, Patricia; Castillo, Oscar
1 Citation
In this paper we describe a method for the optimization of type-2 fuzzy systems based on the level of uncertainty, considering three different cases to reduce the complexity of searching the solution space. The proposed method produces the best fuzzy inference systems for particular applications based on a genetic algorithm. We apply a genetic algorithm to find the optimal type-2 fuzzy system, dividing the search space into three subspaces. We show the comparative results obtained for the benchmark problems.
By
RíosMercado, Roger Z.; SalazarAcosta, Juan C.
5 Citations
This paper addresses a commercial districting problem arising in the bottled beverage distribution industry. The problem consists of grouping a set of city blocks into territories so as to maximize territory compactness. As planning requirements, the grouping seeks to balance both the number of customers and product demand across territories, maintain connectivity of territories, and limit the total cost of routing. A combinatorial optimization model for this problem is introduced. Work on commercial territory design has particularly focused on design decisions. This work is, to the best of our knowledge, the first to address both design and routing decisions simultaneously by considering a budget constraint on the total routing cost in commercial territory design. A greedy randomized adaptive search procedure that incorporates advanced features such as adaptive memory and strategic oscillation is developed. Empirical evidence over a wide set of randomly generated instances based on real-world data shows a very positive impact of these advanced components. Solution quality is significantly improved as well.
By
LuisPérez, Felix Emilio; CruzBarbosa, Raúl; ÁlvarezOlguin, Gabriela
Regionalization methods can help to transfer information from gauged catchments to ungauged river basins. Finding homogeneous regions is crucial for regional flood frequency estimation at ungauged sites. This is the case for the Mexican Mixteca region, where only one gauging station is currently in operation. One way to delineate these homogeneous watersheds into natural groups is through clustering techniques. In this paper, two different clustering approaches are used and compared for the delineation of homogeneous regions. The first one is the hierarchical clustering approach, which is widely used in regionalization studies. The second one is the Fuzzy C-Means technique, which allows a station to belong, to different degrees, to several regions. The optimal number of regions is based on fuzzy cluster validation measures. The experimental results of both approaches are similar, which confirms the delineated homogeneous region for this case study. Finally, a stepwise regression model using the forward selection approach is applied for flood frequency estimation in each homogeneous region found.
By
SuárezCansino, Joel; FrancoÁrcega, Anilú; FloresFlores, Linda Gladiola; LópezMorales, Virgilio; Gabbasov, Ruslan
In addition to the usual tests for analyzing the performance of a decision tree in a classification process, analyzing the amount of time and space required is also useful during supervised decision tree induction. The parallel algorithm called "Parallel Decision Tree for Large Datasets" (ParDTLT for short) has proved to perform very well when large datasets are part of the training and classification process. The training phase expands each node in parallel, considering only a subset of the whole set of training objects. The time complexity analysis proves a linear dependency on the cardinality of the complete set of training objects, and shows that the asymptotic dependence on the cardinality of the selected subset of training objects is linear and log-linear when categorical and numeric data are applied, respectively.
By
SalinasGutiérrez, Rogelio; HernándezAguirre, Arturo; RiveraMeraz, Mariano J. J.; VillaDiharce, Enrique R.
3 Citations
This paper introduces copula functions and the use of the Gaussian copula function to model probabilistic dependencies in supervised classification tasks. A copula is a distribution function with the implicit capacity to model nonlinear dependencies via concordance measures, such as Kendall's τ. Hence, this work studies the performance of a simple probabilistic classifier based on the Gaussian copula function. Without additional preprocessing of the source data, a supervised pixel classifier is tested on a 50-image benchmark; the experiments show this simple classifier has an excellent performance.
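The link between Kendall's τ and the Gaussian copula parameter, ρ = sin(πτ/2), and the copula density itself can be sketched as follows. This is a generic textbook formulation given for orientation, not the authors' full pixel classifier.

```python
from math import exp, pi, sin, sqrt
from statistics import NormalDist

def kendall_to_rho(tau):
    """Gaussian-copula correlation from Kendall's tau: rho = sin(pi*tau/2)."""
    return sin(pi * tau / 2)

def gaussian_copula_density(u, v, rho):
    """Density c(u, v) of the bivariate Gaussian copula, |rho| < 1.

    c(u, v) = exp(-(rho^2 (x^2 + y^2) - 2 rho x y) / (2 (1 - rho^2)))
              / sqrt(1 - rho^2),  with x = Phi^{-1}(u), y = Phi^{-1}(v).
    """
    x = NormalDist().inv_cdf(u)
    y = NormalDist().inv_cdf(v)
    r2 = rho * rho
    z = (r2 * (x * x + y * y) - 2 * rho * x * y) / (2 * (1 - r2))
    return exp(-z) / sqrt(1 - r2)

rho = kendall_to_rho(0.5)          # concordance measured on ranked data
c = gaussian_copula_density(0.3, 0.7, rho)
```

A classifier along these lines would weight class-conditional marginal densities by the copula density to capture the dependence between features.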
By
Hernández, Yasmín; CervantesSalgado, Marilú; PérezRamírez, Miguel; MejíaLavalle, Manuel
1 Citation
The student model is a key component of intelligent tutoring systems, since it enables them to respond to the particular needs of students. In recent years, educational systems have become widespread in schools and industry, and they produce data which can be used to get to know students and to understand and improve the learning process. Student modeling has been improved thanks to educational data mining, which is concerned with discovering novel and potentially useful information from large volumes of data. To build a student model, we used the data log of a virtual reality training system that has been used for several years to train electricians. We compared the results of this student model with a student model built by an expert. We rely on Bayesian networks to represent the student models. Here we present the student models and the results of an initial evaluation.
By
Figueroa, Fernando David Ramirez; Caeiros, Alfredo Victor Mantilla
This paper presents a novel hybrid adaptive fuzzy controller for the regulation of speed on induction machines with direct torque control. The controller is based on a fuzzy system and PID control with decoupled gains. Genetic programming techniques are used for offline optimization of the normalization constants of the fuzzy membership function ranges. Fuzzy c-means clustering is introduced for online optimization of the limits of the triangular fuzzy membership functions. Finally, simulations in LabVIEW are presented that validate the response of the controller with and without load on the machine; results and conclusions are discussed.
By
GómezHerrera, Fernando; RamirezValenzuela, Rodolfo A.; OrtizBayliss, José Carlos; Amaya, Ivan; TerashimaMarín, Hugo
This research describes three novel heuristic-based approaches for solving the 0/1 knapsack problem. The knapsack problem, in its many variants, arises in many practical scenarios such as the selection of investment projects and budget control. As an NP-hard problem, it is not always possible to compute the optimal solution using exact methods and, for this reason, the problem is usually solved using heuristic-based strategies. In this document, we use information about the distributions of weight and profit of the items in knapsack instances to design and implement new heuristic-based methods that solve those instances. The solution model proposed in this work is twofold: the first part focuses on the generation of two new heuristics, while the second explores the combination of solving methods through a hyper-heuristic approach. The proposed heuristics, as well as the hyper-heuristic model, were tested on a heterogeneous set of knapsack problem instances and compared against four heuristics taken from the literature. One of the proposed heuristics proved to be highly competent with respect to heuristics available in the literature. By using the hyper-heuristic, a solver that dynamically selects heuristics based on the problem features, we improved the results obtained by the new heuristics and achieved the best results among all the methods tested in this investigation.
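Why the choice of heuristic matters per instance can be seen with two classic greedy rules for the 0/1 knapsack. These two rules and the run-both selector are illustrative stand-ins, not the heuristics or the feature-based hyper-heuristic proposed in the paper.

```python
def greedy_knapsack(items, capacity, key):
    """Greedy 0/1 knapsack: pack (weight, profit) items in `key` order."""
    load = profit = 0
    for w, p in sorted(items, key=key):
        if load + w <= capacity:
            load += w
            profit += p
    return profit

items = [(10, 60), (20, 100), (30, 120)]   # (weight, profit)
cap = 50
# Rule 1: highest profit density first; Rule 2: highest raw profit first
by_density = greedy_knapsack(items, cap, key=lambda wp: -wp[1] / wp[0])
by_profit = greedy_knapsack(items, cap, key=lambda wp: -wp[1])

# A degenerate "hyper-heuristic": run both rules and keep the better packing
best = max(by_density, by_profit)
```

On this instance density-first packs 160 while profit-first reaches the optimum of 220; on other instances the ranking flips, which is exactly what motivates selecting a heuristic from instance features.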
By
LuisPérez, Felix Emilio; TrujilloRomero, Felipe; MartínezVelazco, Wilebaldo
2 Citations
This paper presents the results of our research on automatic recognition of the Mexican Sign Language (MSL) alphabet as a control element for a service robot. The technique of active contours was used for image segmentation in order to recognize the signs. Once segmented, we obtained the signature of the corresponding sign and trained a neural network for its recognition. Every symbol of the MSL was assigned to a task that the robotic system had to perform; we defined eight different tasks. The system was validated using a simulation environment and a real system. For the real case, we used a mobile platform (Powerbot) equipped with a manipulator with 6 degrees of freedom (PowerCube). For simulation of the mobile platforms, RoboWorks was used as the simulation environment. On both the simulated and real platforms, tests were performed with images different from those learned by the system, obtaining in both cases a recognition rate of 95.8%.
By
MartínezVillaseñor, Lourdes; Ponce, Hiram; Marmolejo, José Antonio; Ramírez, Juan Manuel; Hernández, Agustina
In this paper, a deterministic dynamic mixed-integer programming model for solving the generation and transmission expansion-planning problem is addressed. The proposed model integrates conventional generation with renewable energy sources and is based on centrally planned transmission expansion. Due to a demand that grows over time, it is necessary to generate expansion plans that can meet the future requirements of energy systems. Nowadays, in most systems a public entity develops both the short- and long-term electricity-grid expansion planning, and mainly deterministic methods are employed. In this study, a heuristic optimization approach based on genetic algorithms is presented. Numerical results show the performance of the proposed algorithm.
By
ZatarainCabada, Ramón; BarrónEstrada, María Lucía; GonzálezHernández, Francisco; OramasBustillos, Raúl; AlorHernández, Giner; ReyesGarcía, Carlos Alberto
Studies investigating the effectiveness of affect detection inside intelligent learning environments (ILEs) have reported the effectiveness of including emotion identification on learning. However, there is limited research on detecting and using learning-centered data to investigate metacognitive and affective monitoring with ILEs. In this work we report the methodology we followed to create a new facial expression corpus from electroencephalography information, the implementation of an algorithm, and the training of an SVM to recognize learning-centered emotions (frustration, boredom, engagement and excitement). We also explain changes made to a fuzzy logic system inside an intelligent learning environment. The affect recognizer was tested in an ILE for learning Java programming. We present successful results of the recognizer using our corpus face database and an example test using our ILE.
By
delosCobosSilva, Sergio Gerardo; Gutiérrez Andrade, Miguel Ángel; LaraVelázquez, Pedro; Rincón García, Eric Alfredo; MoraGutiérrez, Roman Anselmo; Ponsich, Antonin
Nonlinear regression is a statistical technique widely used in research to create models that conceptualize the relation among variables that are related in complex ways. These models are widely used in different areas such as economics, biology, finance and engineering, and are subsequently used for different processes, such as prediction, control or optimization. Many standard regression methods have been shown to produce misleading results on certain data sets; this is especially true of ordinary least squares. In this article three metaheuristic models for parameter estimation of nonlinear regression models are described: Artificial Bee Colony, Particle Swarm Optimization and a novel hybrid ABC-PSO algorithm. These techniques were tested on 27 databases from the NIST collection with different degrees of difficulty. The experimental results provide evidence that the proposed algorithm consistently finds good results.
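The use of a metaheuristic for nonlinear-regression parameter estimation can be sketched with a minimal particle swarm minimizing the sum of squared residuals. The inertia and acceleration constants, the parameter box, and the exponential toy model are illustrative assumptions, not the paper's tuned ABC-PSO hybrid.

```python
import math
import random

def pso_fit(xs, ys, model, n_params, iters=300, n_particles=20, seed=7):
    """Minimal particle swarm estimating nonlinear-regression parameters
    by minimizing the sum of squared residuals (SSE)."""
    rng = random.Random(seed)

    def sse(theta):
        return sum((y - model(x, theta)) ** 2 for x, y in zip(xs, ys))

    pos = [[rng.uniform(-2.0, 2.0) for _ in range(n_params)]
           for _ in range(n_particles)]
    vel = [[0.0] * n_params for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pos, key=sse)[:]
    for _ in range(iters):
        for i, p in enumerate(pos):
            for d in range(n_params):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - p[d])
                             + 1.5 * r2 * (gbest[d] - p[d]))
                # keep parameters in a sane box to avoid overflow in exp()
                p[d] = max(-5.0, min(5.0, p[d] + vel[i][d]))
            if sse(p) < sse(pbest[i]):
                pbest[i] = p[:]
            if sse(p) < sse(gbest):
                gbest = p[:]
    return gbest

# Recover a and b in y = a * exp(b * x) from noise-free data
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [1.5 * math.exp(0.8 * x) for x in xs]
theta = pso_fit(xs, ys, lambda x, t: t[0] * math.exp(t[1] * x), 2)
```

Unlike ordinary least squares, nothing here requires linearity or differentiability of the model, which is the appeal of metaheuristic estimators on hard NIST-style problems.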
By
SolorioFernández, Saúl; CarrascoOchoa, J. Ariel; MartínezTrinidad, José Fco.
Unsupervised feature selection methods have raised considerable interest in the scientific community due to their capability of identifying and selecting relevant features in unlabeled data. In this paper, we evaluate and compare seven of the most widely used and outstanding ranking-based unsupervised feature selection methods of the state of the art, which belong to the filter approach. Our study was made on 25 high-dimensional real-world datasets taken from the ASU Feature Selection Repository. From our experiments, we conclude which methods perform significantly better in terms of quality of selection and runtime.
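The filter approach to ranking-based unsupervised feature selection can be illustrated with the simplest possible score. Variance ranking is an assumed toy example standing in for the seven evaluated methods, which use more sophisticated relevance criteria.

```python
def variance_ranking(X):
    """Rank features by sample variance, a filter-style unsupervised
    relevance score needing no labels (highest variance ranks first)."""
    n = len(X)
    scores = []
    for j in range(len(X[0])):
        col = [row[j] for row in X]
        mean = sum(col) / n
        scores.append(sum((v - mean) ** 2 for v in col) / n)
    return sorted(range(len(scores)), key=lambda j: -scores[j])

X = [[1.0, 5.0, 0.0],
     [2.0, 5.1, 10.0],
     [3.0, 4.9, 0.0],
     [4.0, 5.0, 10.0]]
ranking = variance_ranking(X)   # feature 2 varies most, feature 1 least
```

A filter method of this kind scores each feature independently of any learning algorithm, which is what makes the compared methods fast enough for high-dimensional data.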
By
MárquezVega, Luis A.; TorresTreviño, Luis M.
This paper presents the Universal Swarm Optimizer for Multi-Objective Functions (USO), which is inspired by the zone-based model proposed by Couzin, a model that represents in a more realistic way the behavior of biological species such as fish schools and bird flocks. The algorithm is validated using 10 multi-objective benchmark problems, and a comparison with Multi-Objective Particle Swarm Optimization (MOPSO) is presented. The obtained results suggest that the proposed algorithm is very competitive and presents interesting characteristics which could be used to solve a wide range of optimization problems.
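Multi-objective optimizers such as USO and MOPSO are compared on the Pareto fronts they produce; the underlying dominance test can be sketched generically (this is standard machinery, not the USO algorithm itself).

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization):
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and \
           any(x < y for x, y in zip(a, b))

def non_dominated(points):
    """Archive of mutually non-dominated solutions, as maintained by
    swarm-based multi-objective optimizers."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

front = non_dominated([(1, 5), (2, 2), (5, 1), (3, 3), (4, 4)])
```

Here (3, 3) and (4, 4) are dominated by (2, 2), so the archive keeps only the three trade-off points.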
By
Alelhí, RománFlores Mariana; Guillermo, SantamaríaBonfil; Lorena, DíazGonzález; Gustavo, ArroyoFigueroa
In the exploitation stage of a geothermal reservoir, the estimation of the bottom-hole temperature (BHT) is essential to know the available energy potential, as well as the viability of its exploitation. BHT can be measured directly, which is very expensive; therefore, statistical models used as virtual geothermometers are preferred. Geothermometers have been widely used to infer the temperature of deep geothermal reservoirs from the analysis of fluid samples collected at the soil surface from springs and exploration wells. Our procedure is based on an extensive geochemical database (n = 708) with measurements of BHT and of eight main element compositions of the geothermal fluid. Unfortunately, the geochemical database has missing values for some of the measured principal-element compositions. Therefore, to take advantage of all this information in the BHT estimate, a process of imputation or completion of the values is necessary.
In the present work, we compare imputations using mean and median statistics, as well as stochastic regression and support vector machines, to complete our data set of geochemical components. The results showed that regression and SVM are superior to the mean and median, especially because these methods obtained the smallest RMSE and MAE errors.
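The gap between mean imputation and regression-based imputation can be seen on a toy column. The tiny dataset and the simple-linear-regression imputer are illustrative assumptions; the paper uses stochastic regression and SVMs on the full geochemical database.

```python
def mean_impute(col):
    """Fill missing entries (None) with the column mean."""
    known = [v for v in col if v is not None]
    m = sum(known) / len(known)
    return [m if v is None else v for v in col]

def regression_impute(x, y):
    """Fill missing y-values from a least-squares line fitted on the
    complete (x, y) pairs."""
    pairs = [(a, b) for a, b in zip(x, y) if b is not None]
    n = len(pairs)
    mx = sum(a for a, _ in pairs) / n
    my = sum(b for _, b in pairs) / n
    slope = (sum((a - mx) * (b - my) for a, b in pairs)
             / sum((a - mx) ** 2 for a, _ in pairs))
    return [my + slope * (a - mx) if b is None else b
            for a, b in zip(x, y)]

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.0, 4.0, 6.0, 8.0, None]      # the true missing value lies on y = 2x
by_mean = mean_impute(y)            # fills with 5.0, far off the trend
by_regr = regression_impute(x, y)   # follows the linear trend, fills 10.0
```

When a missing value sits at the edge of a trend, the mean pulls it toward the center while a regression imputer follows the relation, which is why the RMSE/MAE comparison favors the model-based methods.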
By
RomeroMontiel, Flor Alejandra; RodríguezVázquez, Katya
DNA microarrays are used for the massive quantification of gene expression. This analysis makes it possible to diagnose, identify and classify different diseases. It is a computationally challenging task due to the large number of genes and the relatively small number of samples.
The generalized neuron (GN) has previously been applied to function approximation, density estimation, prediction and classification problems [1, 2].
In this work we show how a GN can be used in the task of microarray classification. The proposed methodology is as follows: first, the dimensionality of the genes is reduced using a genetic algorithm; then the generalized neuron is trained using one of three bio-inspired algorithms: Particle Swarm Optimization, a Genetic Algorithm, or Differential Evolution. Finally, the precision of the methodology is tested by classifying three DNA microarray databases: the ALL-AML Leukemia benchmark, Colon Tumor, and Prostate Cancer.
By
Carlos, M. Ricardo; Martínez, Fernando; Cornejo, Raymundo; González, Luis C.
Given that thousands of applications are already available for smartphones, we may be inclined to believe that ubiquitous computing is just around the corner, with online processing on these mobile devices. But how well prepared is current smartphone technology to support the execution of demanding algorithms? Surprisingly, few researchers have addressed the processing capabilities of currently available smartphones. In this paper we investigate some issues in this direction: we employed twelve algorithms for optimization and classification to profile the computational demands they place on current smartphones. For this purpose, we chose twelve devices, from low- to high-end models and from six different makers, and measured execution time, CPU and RAM usage while the devices were running the algorithms.
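The execution-time side of such a profiling study can be sketched with a small timing harness; this generic wall-clock sketch is an assumption for illustration, not the instrumentation actually used on the twelve devices.

```python
import time

def profile_median(fn, *args, repeats=5):
    """Median wall-clock time of fn(*args); the median damps scheduler
    noise, which matters on throttling-prone mobile hardware."""
    times = []
    for _ in range(repeats):
        t0 = time.perf_counter()
        fn(*args)
        times.append(time.perf_counter() - t0)
    return sorted(times)[repeats // 2]

# Time a stand-in workload (sorting a reversed list of 50,000 integers)
elapsed = profile_median(sorted, list(range(50000, 0, -1)))
```

Running the same harness over each algorithm on each device yields the comparable execution-time figures the study reports alongside CPU and RAM usage.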
By
LópezMorales, Virgilio; SuárezCansino, Joel; Gabbasov, Ruslan; Arcega, Anilu Franco
In order to formulate and solve a particular Multiple Criteria Decision Making (MCDM) problem, a diverse group of experts must frequently share their knowledge and expertise, and thus uncertainty arises from several sources. In those cases, the Multiplicative Preference Relation (MPR) approach can be a useful technique. An MPR is composed of judgments between pairs of criteria, declared on a crisp scale, that express the decision maker's (DM's) preferences. An MPR is consistent when each expert's information, and consequently her/his judgments, are free of contradictions. Since inconsistencies may lead to incoherent results, individual consistency should be sought in order to make rational choices. In this paper, based on the Hadamard dissimilarity operator, a methodology to derive intervals for MPRs satisfying a consistency index is introduced. Our method combines a numerical algorithm with a nonlinear optimization algorithm. As soon as the synthesis of an interval MPR is achieved, the DM can use these acceptably consistent intervals to express flexibility in her/his preferences, while accomplishing some a priori decision targets, rules and advice given by her/his current framework. Thus, the proposed methodology provides reliable and acceptably consistent interval MPRs, which can be quantified in terms of the Row Geometric Mean Method (RGMM) or the Eigenvalue Method (EM). Finally, some examples are solved with the proposed method in order to illustrate our results and compare them with other methodologies.
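The Row Geometric Mean Method mentioned above can be sketched for a small MPR; the 3x3 example matrix is an assumed, perfectly consistent relation, not one of the paper's interval examples.

```python
from math import prod

def rgmm_weights(A):
    """Row Geometric Mean Method: derive a priority vector from a
    multiplicative preference relation A, where A[i][j] encodes how
    strongly criterion i is preferred over criterion j."""
    n = len(A)
    gm = [prod(row) ** (1.0 / n) for row in A]   # geometric mean per row
    total = sum(gm)
    return [g / total for g in gm]               # normalize to sum to 1

# A perfectly consistent MPR built from the weights (0.5, 0.3, 0.2):
# A[i][j] = w[i] / w[j], so RGMM should recover the weights exactly.
w = (0.5, 0.3, 0.2)
A = [[wi / wj for wj in w] for wi in w]
weights = rgmm_weights(A)
```

On a consistent MPR the recovered priorities match the generating weights; on an inconsistent one, the deviation between A[i][j] and w[i]/w[j] is what consistency indices measure.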
By
Solarte, Mario; RamírezVelarde, Raúl; AlarioHoyos, Carlos; RamírezGonzález, Gustavo; OrdóñezEraso, Hugo
Massive Open Online Courses (MOOCs) have been considered an "educational revolution". Although these courses were designed to reach a massive number of participants, higher education institutions have started to use MOOC technologies and methodologies as support for traditional educational practices, in what have been called Small Private Online Courses (SPOCs) and Massive Private Online Courses (MPOCs), according to the proportion of students enrolled and the teachers who support them. A scarcely explored area of the scientific literature is the possible correlation between performance and learning styles in credit-bearing courses designed to be offered in massive environments. This article presents the results obtained in the MPOC "Daily Astronomy" at the University of Cauca, in terms of the possible associations between learning styles according to Kolb, the results in the evaluations, and the activity demonstrated in the services of the platform that hosted the course.
