Showing 1 to 100 of 254 matching Articles
By
Aranha, Diego F.; Knapp, Edward; Menezes, Alfred; Rodríguez-Henríquez, Francisco
4 Citations
In the past year, the speed record for pairing implementations on desktop-class machines has been broken several times. The speed records for asymmetric pairings were set on a single processor. In this paper, we describe our parallel implementation of the optimal ate pairing over Barreto-Naehrig (BN) curves that is about 1.23 times faster using two cores of an Intel Core i5 or Core i7 machine, and 1.45 times faster using 4 cores of the Core i7, than the state-of-the-art implementation on a single core. We instantiate Hess’s general Weil pairing construction and introduce a new optimal Weil pairing tailored for parallel execution. Our experimental results suggest that the new Weil pairing is 1.25 times faster than the optimal ate pairing on 8-core extensions of the aforementioned machines. Finally, we combine previous techniques for parallelizing the eta pairing on a supersingular elliptic curve with embedding degree 4, and achieve an estimated 1.24-fold speedup on an 8-core extension of an Intel Core i7 over the previous best technique.
By
Taverne, Jonathan; Faz-Hernández, Armando; Aranha, Diego F.; Rodríguez-Henríquez, Francisco; Hankerson, Darrel; López, Julio
6 Citations
The availability of a new carry-less multiplication instruction in the latest Intel desktop processors significantly accelerates multiplication in binary fields and hence presents the opportunity for re-evaluating algorithms for binary field arithmetic and scalar multiplication over elliptic curves. We describe how to best employ this instruction in field multiplication and the effect on performance of doubling and halving operations. Alternate strategies for implementing inversion and half-trace are examined to restore most of their competitiveness relative to the new multiplier. These improvements in field arithmetic are complemented by a study on serial and parallel approaches for Koblitz and random curves, where parallelization strategies are implemented and compared. The contributions are illustrated with experimental results improving the state-of-the-art performance of halving- and doubling-based scalar multiplication on NIST curves at the 112- and 192-bit security levels, and a new speed record for side-channel-resistant scalar multiplication on a random curve at the 128-bit security level.
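The core idea behind such a carry-less multiplier can be sketched in a few lines. The following is our own minimal software analogue of the hardware instruction, not the paper's implementation; the function names and the choice of reduction polynomial in the example are illustrative.

```python
def clmul(a: int, b: int) -> int:
    """Carry-less multiplication: the integers are treated as polynomials
    over GF(2), so partial products are combined with XOR and no carries
    propagate between bit positions."""
    result = 0
    while b:
        if b & 1:
            result ^= a          # XOR instead of addition
        a <<= 1
        b >>= 1
    return result

def gf2m_mul(a: int, b: int, modulus: int, m: int) -> int:
    """Multiply in the binary field GF(2^m): carry-less multiply, then
    reduce modulo an irreducible polynomial of degree m."""
    r = clmul(a, b)
    for i in range(2 * m - 2, m - 1, -1):
        if r & (1 << i):
            r ^= modulus << (i - m)
    return r
```

For instance, with the AES polynomial x^8 + x^4 + x^3 + x + 1 (0x11B), `gf2m_mul(0x53, 0xCA, 0x11B, 8)` returns 1, since {53} and {CA} are multiplicative inverses in that field.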
By
Rajsbaum, Sergio; Raynal, Michel
Due to the advent of multicore machines, shared memory distributed computing models taking into account asynchrony and process crashes are becoming more and more important. This paper visits models for these systems and analyses their properties from a computability point of view. Among them, the base snapshot model and the iterated model are particularly investigated. The paper also visits several approaches that have been proposed to model failures (mainly the wait-free model and the adversary model) and takes a look at the BG simulation. The aim of this survey is to help the reader better understand the power and limits of shared memory models of distributed computing.
By
Taverne, Jonathan; Faz-Hernández, Armando; Aranha, Diego F.; Rodríguez-Henríquez, Francisco; Hankerson, Darrel; López, Julio
13 Citations
The availability of a new carry-less multiplication instruction in the latest Intel desktop processors significantly accelerates multiplication in binary fields and hence presents the opportunity for re-evaluating algorithms for binary field arithmetic and scalar multiplication over elliptic curves. We describe how to best employ this instruction in field multiplication and the effect on performance of doubling and halving operations. Alternate strategies for implementing inversion and half-trace are examined to restore most of their competitiveness relative to the new multiplier. These improvements in field arithmetic are complemented by a study on serial and parallel approaches for Koblitz and random curves, where parallelization strategies are implemented and compared. The contributions are illustrated with experimental results improving the state-of-the-art performance of halving- and doubling-based scalar multiplication on NIST curves at the 112- and 192-bit security levels, and a new speed record for side-channel-resistant scalar multiplication on a random curve at the 128-bit security level. The algorithms presented in this work were implemented on Westmere and Sandy Bridge processors, the latest-generation Intel microarchitectures.
By
Imbs, Damien; Rajsbaum, Sergio; Raynal, Michel
5 Citations
Processes in a concurrent system need to coordinate using a shared memory or a message-passing subsystem in order to solve agreement tasks such as consensus or set agreement. However, coordination is often needed to “break the symmetry” of processes that are initially in the same state, for example to get exclusive access to a shared resource, to get distinct names or to elect a leader.
This paper introduces and studies the family of generalized symmetry breaking (GSB) tasks, which includes election, renaming and many other symmetry breaking tasks. Unlike agreement tasks, a GSB task is “inputless”, in the sense that processes do not propose values; the task only specifies the symmetry breaking requirement, independently of the system’s initial state (where processes differ only in their identifiers). Among various results characterizing the family of GSB tasks, it is shown that (non-adaptive) perfect renaming is universal for all GSB tasks.
By
Santiago-Ramirez, Everardo; Gonzalez-Fraga, J. A.; Ascencio-Lopez, J. I.
2 Citations
In this paper, we compare the performance of three composite correlation filters on a facial recognition problem. We used the ORL (Olivetti Research Laboratory) facial image database to evaluate the performance of K-Law, MACE and ASEF filters. Simulation results demonstrate that K-Law nonlinear composite filters achieve the best performance in terms of recognition rate (RR) and false acceptance rate (FAR). As a result, we observe that correlation filters are able to work well even when the facial image contains distortions such as rotation, partial occlusion and different illumination conditions.
By
Acevedo, Elena; Acevedo, Antonio; Felipe, Federico
2 Citations
A method for diagnosing Parkinson’s disease is presented. The proposal is based on an associative approach, and we used this method for classifying patients with Parkinson’s disease and those who are completely healthy. In particular, an Alpha-Beta Bidirectional Associative Memory is used together with the modified Johnson-Möbius codification in order to deal with mixed noise. We used three methods for testing the performance of our method: Leave-One-Out, Hold-Out and K-fold Cross Validation, and the average accuracy obtained was 97.17%.
By
Cruz-Barbosa, Raúl; Bautista-Villavicencio, David; Vellido, Alfredo
The diagnostic classification of human brain tumours on the basis of magnetic resonance spectra is a non-trivial problem in which dimensionality reduction is almost mandatory. This may take the form of feature selection or feature extraction. In feature extraction using manifold learning models, multivariate data are described through a low-dimensional manifold embedded in data space. Similarities between points along this manifold are best expressed as geodesic distances or their approximations. These approximations can be computationally intensive, and several alternative software implementations have been recently compared in terms of computation times. The current brief paper extends this research to investigate the comparative ability of dimensionality-reduced data descriptions to accurately classify several types of human brain tumours. The results suggest that the way in which the underlying data manifold is constructed in nonlinear dimensionality reduction methods strongly influences the classification results.
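As a concrete illustration of the geodesic approximation the abstract refers to, manifold methods in the Isomap family typically replace Euclidean distances with shortest-path distances on a neighbourhood graph. The sketch below is our own (function names are hypothetical, and real pipelines would use optimized libraries): build a k-nearest-neighbour graph, then run Dijkstra's algorithm on it.

```python
import heapq

def knn_graph(points, k):
    """Build a symmetrized k-nearest-neighbour graph with Euclidean weights."""
    n = len(points)
    dist = lambda p, q: sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
    adj = {i: {} for i in range(n)}
    for i in range(n):
        nearest = sorted((dist(points[i], points[j]), j)
                         for j in range(n) if j != i)[:k]
        for d, j in nearest:
            adj[i][j] = d
            adj[j][i] = d      # symmetrize so the graph is undirected
    return adj

def geodesic_distances(adj, source):
    """Dijkstra shortest paths: graph distances approximate manifold geodesics."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float('inf')):
            continue                     # stale heap entry
        for v, w in adj[u].items():
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist
```

On points sampled along a curve, these graph distances follow the curve instead of cutting across the ambient space, which is exactly what makes them expensive to compute for large data sets.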
By
Coello Coello, Carlos A.
3 Citations
This paper provides a brief introduction to the so-called multi-objective evolutionary algorithms, which are bio-inspired metaheuristics designed to deal with problems having two or more (normally conflicting) objectives. First, we provide some basic concepts related to multi-objective optimization and a brief review of approaches available in the specialized literature. Then, we provide a short review of applications of multi-objective evolutionary algorithms in pattern recognition. In the final part of the paper, we suggest some possible paths for future research in this area which are, from the author’s perspective, promising.
By
King, Valerie; Lonargan, Steven; Saia, Jared; Trehan, Amitabh
8 Citations
We address the problem of designing distributed algorithms for large-scale networks that are robust to Byzantine faults. We consider a message-passing, full information model: the adversary is malicious, controls a constant fraction of processors, and can view all messages in a round before sending out its own messages for that round. Furthermore, each bad processor may send an unlimited number of messages. The only constraint on the adversary is that it must choose its corrupt processors at the start, without knowledge of the processors’ private random bits.
A good quorum is a set of O(log n) processors which contains a majority of good processors. In this paper, we give a synchronous algorithm which uses polylogarithmic time and $\tilde{O}(\sqrt{n})$ bits of communication per processor to bring all processors to agreement on a collection of n good quorums, solving Byzantine agreement as well. The collection is balanced in that no processor is in more than O(log n) quorums. This yields the first solution to Byzantine agreement which is both scalable and load-balanced in the full information model.
The technique, which involves going from a situation where slightly more than a 1/2 fraction of processors are good and agree on a short string with a constant fraction of random bits to a situation where all good processors agree on n good quorums, can be carried out in a fully asynchronous model as well, providing an approach for extending the Byzantine agreement result to this model.
By
Pakray, Partha; Gelbukh, Alexander; Bandyopadhyay, Sivaji
2 Citations
We present an Answer Validation (AV) system based on Textual Entailment and Question Answering. The important features used to develop the AV system are Lexical Textual Entailment, Named Entity Recognition, Question-Answer type analysis, a chunk boundary module and a syntactic similarity module. The proposed AV system is rule based. We first combine the question and the answer into a Hypothesis (H) and take the Supporting Text as the Text (T) to identify the entailment relation as either “VALIDATED” or “REJECTED”. The important features used in the lexical Textual Entailment module of the present system are WordNet-based unigram match, bigram match and skip-gram. In the syntactic similarity module, the important features used are subject-subject comparison, subject-verb comparison, object-verb comparison and cross subject-verb comparison. The results obtained from the answer validation modules are integrated using a voting technique. For training purposes, we used the AVE 2008 development set. Evaluation scores obtained on the AVE 2008 test set show 66% precision and a 65% F-score for the “VALIDATED” decision.
By
Aguilar-González, Pablo M.; Kober, Vitaly
1 Citation
Correlation filters for object detection and location estimation are commonly designed assuming the shape and gray-level structure of the object of interest are explicitly available. In this work we propose the design of correlation filters when the appearance of the target is given in a single training image. The target is assumed to be embedded in a cluttered background, and the image is assumed to be corrupted by additive sensor noise. The designed filters are used to detect the target in an input scene modeled by the non-overlapping signal model. An optimal correlation filter, with respect to the peak-to-output energy ratio criterion, is proposed for object detection and location estimation. We also present estimation techniques for the required parameters. Computer simulation results obtained with the proposed filters are presented and compared with those of common correlation filters.
By
López Jaimes, Antonio; Coello Coello, Carlos A.; Aguirre, Hernán; Tanaka, Kiyoshi
5 Citations
In a previous work we proposed a scheme for partitioning the objective space using the conflict information of the current Pareto front approximation found by an underlying multi-objective evolutionary algorithm. Since that scheme introduced additional parameters that have to be set by the user, in this paper we propose important modifications in order to set those parameters automatically. Such parameters control the number of solutions devoted to exploring each objective subspace, and the number of generations before creating a new partition. Our experimental results show that the new adaptive scheme performs as well as the non-adaptive scheme, and in some cases it outperforms the original scheme.
By
Fraga, Luis Gerardo; Coello Coello, Carlos A.
2 Citations
This chapter presents a review of some of the most representative work regarding techniques and applications of evolutionary algorithms in pattern recognition. Evolutionary algorithms are a set of metaheuristics inspired by Darwin’s “survival of the fittest” principle which are stochastic in nature. Evolutionary algorithms present several advantages over traditional search and classification techniques, since they require less domain-specific information, are easy to use and operate on a set of solutions (the so-called population). Such advantages have made them very popular within pattern recognition (as well as in other domains), as will be seen in the review of applications presented in this chapter.
By
Dani, Varsha; Moore, Cristopher
4 Citations
We prove new lower bounds on the likely size of the maximum independent set in a random graph with a given constant average degree. Our method is a weighted version of the second moment method, where we give each independent set a weight based on the total degree of its vertices.
By
Graff, Mario; Poli, Riccardo
1 Citation
Most theoretical models of evolutionary algorithms are difficult to apply to realistic situations. In this paper, two models of evolutionary program-induction algorithms (EPAs) are proposed which overcome this limitation. We test our approach with two important classes of problems — symbolic regression and Boolean function induction — and a variety of EPAs including different versions of genetic programming, gene expression programming, stochastic iterated hill climbing in program space and one version of Cartesian genetic programming. We compare the proposed models against a practical model of EPAs we previously developed and find that in most cases the new models are simpler and produce better predictions. A great deal can also be learnt about an EPA via a simple inspection of our new models; for example, it is possible to infer which characteristics make a problem difficult or easy for the EPA.
By
Zapotecas Martínez, Saúl; Yáñez Oropeza, Edgar G.; Coello Coello, Carlos A.
In spite of the success of evolutionary algorithms for dealing with multi-objective optimization problems (the so-called multi-objective evolutionary algorithms, or MOEAs), their main drawback is the fine-tuning of their parameters, which is normally done in an empirical way (using a trial-and-error process for each problem at hand) and usually has a significant impact on their performance. In this paper, we present a self-adaptation methodology that can be incorporated into any MOEA in order to allow automatic fine-tuning of parameters, without any human intervention. To validate the proposed mechanism, we incorporate it into NSGA-II, a well-known elitist MOEA, and analyze the performance of the resulting approach. The results reported here indicate that the proposed approach is a viable alternative for self-adapting the parameters of a MOEA.
By
Gaxiola, Fernando; Melin, Patricia; Valdez, Fevrier; Castillo, Oscar
2 Citations
This paper presents a new modular neural network architecture that is used to build a system for pattern recognition based on the iris biometric measurement of persons. In this system, the person iris database is enhanced with image processing methods, and the coordinates of the center and radius of the iris are obtained to make a cut of the area of interest by removing the noise around the iris. The inputs to the modular neural network are the processed iris images and the output is the number of the identified person. The integration of the modules was done with a type-2 fuzzy integrator at the level of the submodules, and with a gating network at the level of the modules.
By
Sánchez, Daniela; Melin, Patricia; Castillo, Oscar
In this paper we propose a new model of a Modular Neural Network (MNN) with fuzzy integration based on granular computing. The topology and parameters of the model are optimized with a Hierarchical Genetic Algorithm (HGA). The model was applied to the case of human recognition to illustrate its applicability. The proposed method is able to divide the data automatically into submodules, to work with a percentage of the images, and to select which images will be used for training. To test this method, we considered the problem of human recognition based on the ear, using a database of 77 persons (with 4 images per person).
By
Rodríguez, Lisbeth; Li, Xiaoou; Mejía-Alvarez, Pedro
2 Citations
Vertical partitioning is a well-known technique to improve query response time in relational databases. It consists of dividing a table into a set of fragments of attributes according to the queries run against the table. In dynamic systems the queries tend to change with time, so a dynamic vertical partitioning technique is needed that adapts the fragments to the changes in query patterns in order to avoid long query response times. In this paper, we propose an active system for dynamic vertical partitioning of relational databases, called DYVEP (DYnamic VErtical Partitioning). DYVEP uses active rules to vertically fragment and refragment a database without intervention of a database administrator (DBA), maintaining an acceptable query response time even when the query patterns in the database change. Experiments with the TPC-H benchmark demonstrate efficient query response times.
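DYVEP's rule set is not given in the abstract, but the attribute-affinity idea underlying most vertical partitioning schemes can be sketched. The toy code below is our own (hypothetical names, a deliberately simple greedy policy): count how often attributes are requested together, weighted by query frequency, and merge strongly co-used attributes into a fragment.

```python
from collections import defaultdict

def attribute_affinity(queries, attributes, freq):
    """aff[a][b] = total frequency of queries that access both a and b."""
    aff = {a: defaultdict(int) for a in attributes}
    for q, used in enumerate(queries):
        for a in used:
            for b in used:
                aff[a][b] += freq[q]
    return aff

def greedy_fragments(queries, attributes, freq, threshold):
    """Greedily group attributes whose pairwise affinity reaches `threshold`;
    every attribute ends up in exactly one fragment."""
    aff = attribute_affinity(queries, attributes, freq)
    fragments, assigned = [], set()
    for a in attributes:
        if a in assigned:
            continue
        frag = {a}
        for b in attributes:
            if b != a and b not in assigned and aff[a][b] >= threshold:
                frag.add(b)
        assigned |= frag
        fragments.append(sorted(frag))
    return fragments
```

A dynamic scheme in the spirit of the paper would recompute these counts as the query log evolves and refragment when the resulting partition changes.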
By
Morales-Hernández, Luis A.; Herrera-Navarro, Ana M.; Manriquez-Guerrero, Federico; Peregrina-Barreto, Hayde; Terol-Villalobos, Iván R.
1 Citation
The microstructure of graphite nodules plays a fundamental role in the mechanical properties of cast iron. Traditional measures used to study spheroidal graphite are nodule density, nodularity, volume fraction and mean size. However, sometimes these parameters do not permit a good characterization of the microstructure, since they do not allow the discrimination of different regions. In fact, other measures such as size and spatial distributions enable a better understanding of the mechanical properties that can be obtained either by altering certain processing variables or through various heat treatments. In the present paper a method to characterize graphite nodule microstructure based on the connectivity generated by dilations is introduced. This approach, which takes into account the size and spatial distributions of graphite, makes it possible to relate the microstructure of graphite nodules to their wear behavior.
By
Rodríguez, Luis-Felipe; Ramos, Félix
Cognitive architectures allow the emergence of behaviors in autonomous agents. Their design is commonly based on multidisciplinary theories and models that explain the mechanisms underlying human behaviors, as well as on standards and principles used in the development of software systems. The formal and rigorous description of such architectures, however, is a software engineering practice that has rarely been implemented. In this paper, we present the formal specification of a neuroscience-inspired cognitive architecture, which ensures its proper development and subsequent operation, as well as the unambiguous communication of its operational and architectural assumptions. In particular, we use the Real-Time Process Algebra to formally describe its architectural components and emerging behaviors.
By
Silván-Cárdenas, José Luis; Wang, Le
Building footprint geometry is a basic layer of information required by government institutions for a number of land management operations and research. LiDAR (light detection and ranging) is a laser-based altimetry measurement instrument that is flown over relatively wide land areas in order to produce digital surface models. Although high spatial resolution LiDAR measurements (of around 1 m horizontally) are suitable for detecting above-ground features through elevation discrimination, the automatic extraction of buildings in many cases, such as in residential areas with complex terrain forms, has proved a difficult task. In this study, we developed a method for detecting building footprints from LiDAR altimetry data and tested its performance over four sites located in Austin, TX. Compared to another standard method, the proposed method had comparable accuracy and better efficiency.
By
Semenov, Oleg; Olah, Mark J.; Stefanovic, Darko
Molecular spiders are nanoscale walkers made with DNA enzyme legs attached to a common body. They move over a surface of DNA substrates, cleaving them and leaving behind product DNA strands, which they are able to revisit. Simple one-dimensional models of spider motion show significant superdiffusive motion when the leg-substrate bindings are longer-lived than the leg-product bindings. This gives the spiders potential as a faster-than-diffusion transport mechanism. However, analysis shows that single-spider motion eventually decays into an ordinary diffusive motion, owing to the ever-increasing size of the region of cleaved products. Inspired by the cooperative behavior of natural molecular walkers, we propose a model for multiple walkers moving collectively over a one-dimensional lattice. We show that when walkers are sequentially released from the origin, the collective effect is to prevent the leading walkers from moving too far backwards. Hence there is an effective outward pressure on the leading walkers that keeps them moving superdiffusively for longer times, despite the growth of the product region.
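The collective effect described above is easy to reproduce in a toy Monte Carlo model. The sketch below is our own simplification (parameter values and names are illustrative, not the paper's model): each walker hops rarely from uncleaved substrate and often from product, walkers exclude each other, and a reflecting wall sits at the origin.

```python
import random

def simulate_spiders(n_spiders, steps, p_substrate=0.1, p_product=0.5, seed=1):
    """Toy model of molecular spiders on a half-line lattice.

    Sites start as substrate; a spider cleaves the site it sits on, turning
    it into product.  Hops from substrate are rare (long-lived binding) and
    hops from product are frequent.  Spiders exclude each other, so walkers
    released behind the leader push it outward.  Returns the leader's final
    position."""
    rng = random.Random(seed)
    positions = list(range(n_spiders))   # spiders released next to the origin
    product = set()                      # sites already cleaved
    for _ in range(steps):
        i = rng.randrange(n_spiders)
        x = positions[i]
        p_hop = p_product if x in product else p_substrate
        product.add(x)                   # the spider cleaves its current site
        if rng.random() < p_hop:
            nx = x + (1 if rng.random() < 0.5 else -1)
            if nx >= 0 and nx not in positions:   # exclusion + wall at origin
                positions[i] = nx
    return max(positions)
```

Because trailing walkers occupy the cleaved region behind the leader, the leader spends more of its time on fresh substrate at the frontier, which is the qualitative mechanism for the outward pressure the abstract describes.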
By
Jose-Garcia, Adan; Romero-Monsivais, Hillel; Hernandez-Morales, Cindy G.; Rodriguez-Cristerna, Arturo; Rivera-Islas, Ivan; Torres-Jimenez, Jose
2 Citations
Cryptosystems require the computation of modular exponentiation, an operation related to the problem of finding a minimal addition chain. However, obtaining a shortest addition chain for a given exponent n is an NP-Complete problem whose search space size is proportional to n!. This paper introduces a novel approach to the minimal addition chain problem, through an implementation of a Simulated Annealing (SA) algorithm. The representation used in our SA is based on the Factorial Number System (FNS). We use a fine-tuning process to get the best performance of SA using a Covering Array (CA), Diophantine Equation solutions (DE) and Neighborhood Functions (NF). We present a parallel implementation that executes the fine-tuning process using the Message Passing Interface (MPI) and the Single Program Multiple Data (SPMD) model. These features allowed us to calculate minimal addition chains for benchmarks considered difficult in a very short time; the experimental results show that this approach is a viable alternative for solving the minimal addition chain problem.
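To see why short addition chains matter for cryptosystems: each chain element costs exactly one modular multiplication, so a chain shorter than the square-and-multiply sequence saves work. The helper below is our own illustration (the paper searches for chains; it does not evaluate them this way) and assumes the chain is given in ascending order. For example, the chain 1, 2, 3, 6, 12, 15 computes x^15 with five multiplications, whereas binary square-and-multiply needs six.

```python
def addition_chain_pow(base, chain, mod):
    """Compute base**chain[-1] % mod from an ascending addition chain:
    every element after the first must be the sum of two earlier (not
    necessarily distinct) elements, so each costs a single modular
    multiplication."""
    assert chain[0] == 1, "an addition chain starts at 1"
    powers = {1: base % mod}
    for c in chain[1:]:
        # find earlier elements a and c-a whose powers are already known
        pair = next(((a, c - a) for a in chain if a < c and (c - a) in powers),
                    None)
        if pair is None:
            raise ValueError(f"{c} is not a sum of two earlier chain elements")
        powers[c] = (powers[pair[0]] * powers[pair[1]]) % mod
    return powers[chain[-1]]
```

The SA search described in the abstract looks for the shortest such chain for a target exponent; once found, the chain is reused for every exponentiation with that exponent.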
By
Mújica-Vargas, Dante; Gallegos-Funes, Francisco Javier; Rosales-Silva, Alberto J.; Cruz-Santiago, Rene
1 Citation
Image segmentation is a key step in many image analysis applications. So far, no general method exists that suitably segments all images, whether corrupted or noise free. In this paper, we propose to modify the Fuzzy C-means clustering algorithm and the FCM_S1 variant by using the RML-estimator. The idea of our method is to obtain robust clustering algorithms able to segment images with different types and levels of noise. The performance of the proposed algorithms is tested on synthetic and real images. Experimental results show that the proposed algorithms are more robust to the presence of noise and more effective than the comparative algorithms.
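For reference, the standard Fuzzy C-means that the authors modify alternates two textbook updates: memberships computed from distances to the centroids, and centroids recomputed as membership-weighted means. A minimal 1-D sketch follows (our own; the paper's RML-estimator modification is not reproduced here):

```python
def fcm(data, c, m=2.0, iters=50):
    """Standard Fuzzy C-means on a list of scalars.

    Membership update: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
    Centroid update:   v_i  = sum_k u_ik^m x_k / sum_k u_ik^m
    """
    lo, hi = min(data), max(data)
    # deterministic initialization: centroids spread over the data range
    centroids = [lo + (i + 0.5) * (hi - lo) / c for i in range(c)]
    for _ in range(iters):
        u = []
        for x in data:
            d = [abs(x - v) + 1e-12 for v in centroids]   # avoid zero division
            u.append([1.0 / sum((d[i] / d[j]) ** (2.0 / (m - 1.0))
                                for j in range(c))
                      for i in range(c)])
        centroids = [sum(u[k][i] ** m * x for k, x in enumerate(data)) /
                     sum(u[k][i] ** m for k in range(len(data)))
                     for i in range(c)]
    return sorted(centroids)
```

Noise robustness is exactly where this baseline struggles: a single outlying intensity pulls the weighted means, which is what motivates replacing the plain mean with a robust estimator as the paper proposes.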
By
Nava-Ortiz, M.; Gómez-Flores, W.; Díaz-Pérez, A.; Toscano-Pulido, G.
Segmentation is an important step in optical character recognition systems, since recognition rates depend strongly on the accuracy of binarization techniques. Hence, it is necessary to evaluate different segmentation methods in order to select the most adequate one for a specific application. However, when gold patterns are not available for comparing the binarized outputs, the recognition rates of the entire system can be used to assess performance. In this article we present an evaluation of five local adaptive binarization methods for digit recognition in water meters, by measuring misclassification rates. These methods were studied because of their simplicity to implement on camera-based devices, such as cell phones, with limited hardware capabilities. The obtained results pointed out that Bernsen’s method achieved the best recognition rates when the normalized central moments are employed as features.
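Bernsen's method, the best performer in this study, thresholds each pixel at the midpoint between the minimum and maximum gray level in its local window, with a contrast test to guard against noise in flat regions. A minimal sketch (the window size and contrast limit are illustrative defaults, not the paper's settings):

```python
def bernsen_binarize(img, window=3, contrast_min=15):
    """Bernsen local adaptive binarization of a 2-D list of gray levels
    (0-255).  Each pixel is thresholded at (local_min + local_max) / 2;
    low-contrast neighbourhoods fall back to a global threshold of 128."""
    h, w = len(img), len(img[0])
    r = window // 2
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            patch = [img[yy][xx]
                     for yy in range(max(0, y - r), min(h, y + r + 1))
                     for xx in range(max(0, x - r), min(w, x + r + 1))]
            lo, hi = min(patch), max(patch)
            if hi - lo < contrast_min:
                out[y][x] = 1 if img[y][x] >= 128 else 0   # flat region
            else:
                out[y][x] = 1 if img[y][x] >= (hi + lo) / 2 else 0
    return out
```

Its appeal on limited hardware is clear from the sketch: only a min, a max and a comparison per pixel, with no global statistics required.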
By
Cuaya, German; Muñoz-Meléndez, Angélica; Morales, Eduardo F.
2 Citations
In many classification problems, and in particular in medical domains, it is common to have an unbalanced class distribution. This poses problems for classifiers, as they tend to perform poorly on the minority class, which is often the class of interest. One commonly used strategy to improve classification performance is to select a subset of relevant features. Feature selection algorithms, however, have not been designed to favour the classification performance of the minority class. In this paper, we present a novel filter feature selection algorithm for unbalanced data sets, called FSMC. FSMC selects attributes whose minority class distributions are significantly different from their majority class distributions. FSMC is fast, simple, selects a small number of features and outperforms in most cases other feature selection algorithms in terms of global accuracy and in terms of performance measures for the minority class such as precision, recall, F-measure and ROC values.
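The abstract does not give FSMC's exact scoring function, so the sketch below illustrates the general idea with a generic surrogate of our own: rank each feature by the standardized difference between its minority-class and majority-class means. Both the function names and the score are assumptions, not the FSMC algorithm.

```python
def rank_features(X, y, minority_label):
    """Rank feature indices by how different the minority-class distribution
    is from the majority class (standardized mean difference, descending)."""
    def mean(v):
        return sum(v) / len(v)
    def std(v):
        mu = mean(v)
        return (sum((x - mu) ** 2 for x in v) / len(v)) ** 0.5
    scores = []
    for j in range(len(X[0])):
        mino = [row[j] for row, lbl in zip(X, y) if lbl == minority_label]
        majo = [row[j] for row, lbl in zip(X, y) if lbl != minority_label]
        pooled = (std(mino) ** 2 + std(majo) ** 2) ** 0.5 + 1e-12
        scores.append((abs(mean(mino) - mean(majo)) / pooled, j))
    return [j for _, j in sorted(scores, reverse=True)]
```

A filter of this kind never trains a classifier, which is what makes it fast; the minority-vs-majority comparison is what biases the selection toward the rare class of interest.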
By
Vilariño, Darnes; Pinto, David; Balderas, Carlos; Tovar, Mireya; León, Saul
In this paper we present the results of the evaluation of an information retrieval system constructed at the Faculty of Computer Science, BUAP. This system was used in the Data-Centric track of the Initiative for the Evaluation of XML retrieval (INEX 2010). This track focuses on the extensive use of the rich structure of the documents, beyond their content. We have considered topics (queries) in two variants of the information need: Content Only (CO) and Content And Structure (CAS). The obtained results are shown and compared with those presented by other teams in the competition.
By
Guzman, Enrique; Garcia, Ivan; Manzano, Manuel
The scientific and academic interest in image processing and analysis on autonomous systems, for solving problems associated with artificial vision such as object recognition and trajectory planning of robots, has grown in recent years. On the other hand, reconfigurable logic has attractive features for implementing artificial vision applications on embedded systems. In this context, this paper aims to show the design and implementation of an integral environment for implementing and evaluating algorithms for digital image processing and analysis inside an FPGA.
By
Mújica-Vargas, Dante; Gallegos-Funes, Francisco J.; Cruz-Santiago, Rene
1 Citation
In this paper we present an image processing scheme to segment noisy images, based on a robust estimator in the filtering stage and the standard Fuzzy C-Means (FCM) clustering algorithm to segment the images. The main objective of the paper is to evaluate the performance of the Rank M-type L-filter with different influence functions and to establish a reference base for including the filter in the objective function of the FCM algorithm in future work. The filter uses the Rank M-type (RM) estimator in the scheme of the L-filter to gain robustness in the presence of different types of noise and combinations of them. Tests were made on synthetic and real images subjected to three types of noise, and the results are compared with six reference modified Fuzzy C-Means methods for segmenting noisy images.
By
Velasco Elizondo, Perla; Ndjatchi, Mbe Koua Christophe
1 Citation
An interface specification serves as the sole medium for component understanding and use. Current practice of deriving these specifications for composite components does not give much weight to doing it systematically and unambiguously. This paper presents our progress on developing an approach to tackle this issue. We focus on deriving functional interface specifications for composite components, constructed via composition operators. In our approach, the composites’ interfaces are not generated in an ad hoc manner via delegation mechanisms, but are derived systematically, consistently and largely automatically via a set of functions on the functional interfaces of the composed components. Via an example, we illustrate the aforementioned benefits as well as the fact that our approach provides a new view into the space of interface generation.
By
Hernandez-Leal, Pablo; Sucar, L. Enrique; Gonzalez, Jesus A.; Morales, Eduardo F.; Ibarguengoytia, Pablo H.
1 Citation
Diagnosis in industrial domains is a complex problem because it includes uncertainty management and temporal reasoning. Dynamic Bayesian Networks (DBNs) can deal with this type of problem; however, they usually lead to complex models. Temporal Nodes Bayesian Networks (TNBNs) are an alternative to DBNs for temporal reasoning that results in much simpler and more efficient models in certain domains. However, methods for learning this type of model from data have not been developed. In this paper we propose a learning algorithm to obtain the structure and temporal intervals of TNBNs from data. The method has three phases: (i) obtain an initial interval approximation, (ii) learn the network structure based on the intervals, and (iii) refine the intervals for each temporal node. The number of possible sets of intervals is obtained for each temporal node based on a clustering algorithm, and the set of intervals that maximizes the prediction accuracy is selected. We applied this method to learn a TNBN for diagnosis and prediction in a combined-cycle power plant. The proposed algorithm obtains a simple model with high predictive accuracy.
By
Flores, Juan J.; López, Rodrigo; Barrera, Julio
4 Citations
Evolutionary computation draws inspiration from nature to formulate metaheuristics capable of optimizing several kinds of problems. A family of algorithms has emerged based on this idea, e.g. genetic algorithms, evolutionary strategies, particle swarm optimization (PSO), ant colony optimization (ACO), etc. In this paper we present a population-based metaheuristic inspired by the gravitational forces produced by the interaction of the masses of a set of bodies. We explored physics in order to find useful analogies for designing an optimization metaheuristic. The proposed algorithm is capable of finding the optima of unimodal and multimodal functions commonly used to benchmark evolutionary algorithms. We show that the proposed algorithm (Gravitational Interactions Optimization, GIO) works and outperforms PSO with niches in both cases. Our algorithm does not depend on a radius parameter and does not need niches to solve multimodal problems. We compare GIO with other metaheuristics with respect to the mean number of evaluations needed to find the optima.
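The abstract sketches the gravitational analogy but not an update rule. Below is a minimal, hypothetical sketch of a gravitational-interaction optimizer: bodies with better fitness receive larger masses, every body is pulled toward the others by an inverse-square attraction, and the best position ever visited is reported. All names, parameters and constants here are our own assumptions, not the authors' algorithm.

```python
import math
import random

def sphere(x):
    """Toy unimodal benchmark: global optimum 0 at the origin."""
    return sum(v * v for v in x)

def gio_minimize(f, dim=2, n_bodies=20, steps=200, G=0.05, seed=1):
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_bodies)]
    vel = [[0.0] * dim for _ in range(n_bodies)]
    best = min(pos, key=f)                      # best-ever position seen
    for _ in range(steps):
        fit = [f(p) for p in pos]
        lo, hi = min(fit), max(fit)
        span = (hi - lo) or 1.0
        mass = [1.0 + (hi - fi) / span for fi in fit]  # better fitness -> heavier
        for i in range(n_bodies):
            for j in range(n_bodies):
                if i == j:
                    continue
                d2 = sum((pos[j][k] - pos[i][k]) ** 2 for k in range(dim)) + 1e-9
                pull = G * mass[i] * mass[j] / d2      # inverse-square attraction
                for k in range(dim):
                    vel[i][k] += pull * (pos[j][k] - pos[i][k]) / math.sqrt(d2)
            for k in range(dim):
                vel[i][k] *= 0.9                       # damping keeps bodies stable
                pos[i][k] += vel[i][k]
        cand = min(pos, key=f)
        if f(cand) < f(best):
            best = list(cand)
    return best

best = gio_minimize(sphere)
```

Note that, unlike niching PSO, nothing here depends on a radius parameter; the attraction strengths alone shape the swarm.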
By
Sakai, Toshinori; Urrutia, Jorge
1 Citation
Let P be a set of n points such that each of its elements has a unique weight in {1, …, n}. P is called a wp-set. A non-crossing polygonal line connecting some elements of P in increasing (or decreasing) order of their weights is called a monotonic path of P. A simple polygon with vertices in P is called monotonic if it is formed by a monotonic path and an edge connecting its endpoints. In this paper we study the problem of finding large monotonic polygons and paths in wp-sets. We establish some sharp bounds concerning these problems. We also study extremal problems on the number of monotonic paths and polygons of a wp-set.
By
Guzmán, Enrique; Jiménez, Ofelia M. C.; Pérez, Alejandro D.; Pogrebnyak, Oleksiy
1 Citation
In this paper, the authors propose an algorithm for segmenting grayscale images using an associative memories approach. The algorithm is divided into three steps. In the first step, a set of regions (classes), each of which groups a certain number of pixel values, is obtained. In the second step, the associative memories training phase is applied to the information obtained from the first phase, yielding an associative network that contains the group of centroids of the regions into which the image will be segmented. Finally, using the associative memories classification phase, the centroid to which each pixel belongs is obtained and the image segmentation process is completed.
By
Vázquez, Roberto A.; Garro, Beatriz A.
6 Citations
Metaheuristic algorithms inspired by nature have been used in a wide range of optimization problems. These types of algorithms have gained popularity in the field of artificial neural networks (ANN). On the other hand, spiking neural networks are a new type of ANN that simulates the behaviour of a biological neural network in a more realistic manner. Furthermore, these neural models have been applied to solve some pattern recognition problems. In this paper, we propose the use of the particle swarm optimization (PSO) algorithm to adjust the synaptic weights of a spiking neuron when it is applied to solve a pattern classification task. Given a set of input patterns belonging to K classes, each input pattern is transformed into an input signal. Then, the spiking neuron is stimulated during T ms and the firing rate is computed. After adjusting the synaptic weights of the neural model using the PSO algorithm, input patterns belonging to the same class will generate similar firing rates. On the contrary, input patterns belonging to other classes will generate firing rates different enough to discriminate among the classes. Finally, a comparison between the PSO algorithm and a differential evolution algorithm is presented when the spiking neural model is applied to solve nonlinear and real object recognition problems.
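The optimization loop described above can be sketched with standard PSO. As a rough illustration only: `firing_rate` below is a toy linear stand-in for the spiking model (the paper simulates an actual spiking neuron for T ms), and the patterns, target rates and PSO constants are all hypothetical.

```python
import random

# Toy stand-in for the spiking model: the "firing rate" is approximated
# here by a weighted sum of the input pattern.
def firing_rate(w, x):
    return sum(wi * xi for wi, xi in zip(w, x))

# Two toy classes; the target rates per class are hypothetical.
PATTERNS = [([1.0, 0.0], 1.0), ([0.9, 0.1], 1.0),   # class 0 -> low rate
            ([0.0, 1.0], 5.0), ([0.1, 0.9], 5.0)]   # class 1 -> high rate

def cost(w):
    # Patterns of the same class should produce similar rates, near target.
    return sum((firing_rate(w, x) - target) ** 2 for x, target in PATTERNS)

def pso(cost, dim=2, n=15, steps=100, seed=3):
    rng = random.Random(seed)
    pos = [[rng.uniform(-6, 6) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [list(p) for p in pos]
    gbest = min(pos, key=cost)
    for _ in range(steps):
        for i in range(n):
            for k in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][k] = (0.7 * vel[i][k]
                             + 1.5 * r1 * (pbest[i][k] - pos[i][k])
                             + 1.5 * r2 * (gbest[k] - pos[i][k]))
                pos[i][k] += vel[i][k]
            if cost(pos[i]) < cost(pbest[i]):
                pbest[i] = list(pos[i])
        gbest = min(pbest, key=cost)
    return gbest

w = pso(cost)
```

After tuning, patterns of different classes yield clearly separated rates, which is the discrimination property the paper relies on.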
By
Colores, Juan M.; García-Vázquez, Mireya; Ramírez-Acosta, Alejandro; Pérez-Meana, Héctor
5 Citations
During video acquisition in an automatic non-cooperative biometric iris recognition system, not all the iris images obtained from the video sequence are suitable for recognition. Hence, it is important to acquire high quality iris images and quickly identify them in order to eliminate the poor quality ones (mostly defocused images) before the subsequent processing. In this paper, we present the results of a comparative analysis of four methods for iris image quality assessment to select clear images in the video sequence. The goal is to provide a solid analytic ground to underscore the strengths and weaknesses of the most widely implemented methods for iris image quality assessment. The methods are compared based on their robustness to different types of iris images and the computational effort they require. The experiments with the built database (100 videos from MBGC v2) demonstrate that the best performance scores are generated by the kernel proposed by Kang & Park. The FAR and FRR obtained are 1.6% and 2.3%, respectively.
By
Galicia-Haro, Sofía N.; Gelbukh, Alexander
This paper reports research on temporal expressions. The analyzed phrases include a common temporal expression for a period of years reinforced by an adverb of time. We found that some of those phrases are age-related expressions. We analyzed samples obtained from the Internet for Spanish and French to determine appropriate annotations for marking up text and possible translations. We present the results for a group of selected classes.
By
Cairo, Osvaldo; Olarte, Juan Gabriel; Rivera, Fernando
A number of research efforts have been devoted to deploying agent technology applications in the field of Agent-Mediated Electronic Commerce. On the one hand, there are applications that simplify electronic transactions, such as intelligent search engines and browsers, learning agents, recommender agent systems and agents that share knowledge. Thanks to the development and availability of agent software, e-commerce can use more than only telecommunications and online data processing. On the other hand, there are applications that include negotiation as part of their core activities, such as the information systems field with negotiation support systems; the multi-agent systems field with searching, trading and negotiation agents; and the market design field with electronic auctions. Although negotiation is an important business activity, it has not been studied extensively in either the traditional business or the e-commerce context. This paper introduces the idea of developing an agent with negotiation capabilities applied to the Latin American market, where both the technological gap and an inappropriate approach to motivating electronic transactions are important factors. We address these issues by presenting a negotiation strategy that allows the interaction between an intelligent agent and consumers with Latin American idiosyncrasy.
By
Chacon-Murguia, Mario I.; Perez-Vargas, Francisco J.
6 Citations
This paper presents a method to detect fire regions in thermal videos that can be used for both outdoor and indoor environments. The proposed method works with static and moving cameras. The detection is achieved through a linear weighted classifier based on two features. The features are extracted from candidate regions by the following process: contrast enhancement by the Local Intensities Operation, and candidate region selection by thermal blob analysis. The features computed from these candidate regions are region shape regularity, determined by wavelet decomposition analysis, and region intensity saturation. The method was tested with several thermal videos, showing a performance of 4.99% false positives in non-fire videos and 75.06% correct detection with 7.27% false positives in fire regions. Findings indicate an acceptable performance compared with other methods, especially since this method, unlike others, works with moving-camera videos.
By
Castro-Sánchez, Noé Alejandro; Sidorov, Grigori
2 Citations
In this paper we present an automatic method for extracting synonyms of verbs from an explanatory dictionary based only on the hyponym/hyperonym relations existing between the verbs defined and the genus terms used in their definitions. The set of verb–genus pairs can be considered a directed graph, so we applied an algorithm to identify cycles in this kind of structure. We found that some cycles represent chains of synonyms. We obtain high precision and low recall.
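The verb–genus structure described above can be sketched as a functional directed graph (each defined verb points to the head verb of its definition); cycles then surface as candidate synonym chains. The dictionary entries below are invented for illustration, and the cycle finder is a generic sketch, not the authors' algorithm.

```python
# Hypothetical verb -> genus pairs extracted from dictionary definitions.
GENUS = {
    "shut": "close",
    "close": "shut",      # mutual definition: a 2-cycle, i.e. a synonym pair
    "stroll": "walk",
    "walk": "move",
    "move": "go",
}

def find_cycles(graph):
    """Return the cycles of a functional directed graph (one edge per node)."""
    cycles = []
    seen = set()
    for start in graph:
        path, node = [], start
        while node in graph and node not in seen and node not in path:
            path.append(node)
            node = graph[node]
        if node in path:                       # closed a new cycle
            cycles.append(path[path.index(node):])
        seen.update(path)
    return cycles

cycles = find_cycles(GENUS)
```

Here the chain stroll → walk → move → go never closes, so only the shut/close pair is reported as a synonym candidate.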
By
Ita, Guillermo; Bautista, César; Altamirano, Luis C.
The 3-Colouring of a graph is a classic NP-complete problem. We show that some solutions for 3-Colouring can be built in polynomial time based on the number of basic cycles existing in the graph. For this, we design a reduction from the proper 3-Colouring of a graph G to a 2-CF Boolean formula F_{G}, where the number of clauses in F_{G} depends on the number of basic cycles in G. Any model of F_{G} provides a proper 3-Colouring of G. Thus, F_{G} is a logical pattern whose models encode the proper 3-Colourings of the graph G.
By
Vázquez-Santacruz, Eduardo; Chakraborty, Debrup
In this paper we present a new method to create neural network ensembles. In an ensemble method like bagging one needs to train multiple neural networks to create the ensemble. Here we present a scheme to generate different copies of a network from one trained network, and use those copies to create the ensemble. The copies are produced by adding controlled noise to a trained base network. We provide a preliminary theoretical justification for our method and experimentally validate the method on several standard data sets. Our method can improve the accuracy of a base network and give rise to considerable savings in training time compared to bagging.
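A minimal sketch of this idea, assuming a one-weight logistic "network" as the base learner: train once, then perturb the trained parameters with Gaussian noise to obtain ensemble members whose outputs are averaged, instead of retraining as in bagging. The noise level, copy count and toy data are all hypothetical choices.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Tiny toy task: label 1 for x > 0 (data invented for illustration).
DATA = [(-2.0, 0), (-1.0, 0), (-0.5, 0), (0.5, 1), (1.0, 1), (2.0, 1)]

def train_base(steps=500, lr=0.5):
    w, b = 0.0, 0.0
    for _ in range(steps):
        for x, y in DATA:
            p = sigmoid(w * x + b)
            w -= lr * (p - y) * x      # gradient of the log-loss
            b -= lr * (p - y)
    return w, b

def noisy_ensemble(w, b, n_copies=10, sigma=0.1, seed=7):
    """Perturb one trained model to get an ensemble of copies."""
    rng = random.Random(seed)
    copies = [(w + rng.gauss(0, sigma), b + rng.gauss(0, sigma))
              for _ in range(n_copies)]
    def predict(x):
        votes = [sigmoid(wi * x + bi) for wi, bi in copies]
        return sum(votes) / len(votes)   # average the copies' outputs
    return predict

w, b = train_base()
predict = noisy_ensemble(w, b)
```

The training-time saving is immediate: one optimization run plus cheap perturbations, versus one run per ensemble member in bagging.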
By
Ortiz-Bayliss, José Carlos; Terashima-Marín, Hugo; Conant-Pablos, Santiago Enrique
1 Citation
Hyper-heuristics are methodologies used to choose from a set of heuristics and decide which one to apply given some properties of the current instance. When solving a Constraint Satisfaction Problem, the order in which the variables are selected to be instantiated has implications for the complexity of the search. We propose a neural network hyper-heuristic approach for variable ordering within Constraint Satisfaction Problems. The first step in our approach requires generating a pattern that maps any given instance, expressed in terms of constraint density and tightness, to one adequate heuristic. That pattern is later used to train various neural networks which represent hyper-heuristics. The results suggest that neural networks generated through this methodology represent a feasible alternative for encoding hyper-heuristics that exploit the strengths of the heuristics to minimise the cost of finding a solution.
By
Zamudio-Fuentes, Luis Miguel; García-Vázquez, Mireya S.; Ramírez-Acosta, Alejandro Alvaro
1 Citation
Recent research on iris recognition without user cooperation has introduced video-based iris capturing approaches. Indeed, video provides more information and more flexibility in the image acquisition stage for non-cooperative iris recognition systems. However, a video sequence can contain images with different levels of quality. Therefore, it is necessary to select the highest quality images from each video to improve iris recognition performance. In this paper, we propose, as part of a video quality assessment module, a new local iris image quality method based on spectral energy analysis. This approach does not require iris region segmentation to determine the quality of the image, as most existing approaches do. In contrast to other methods, the proposed algorithm uses a significant portion of the iris region to measure the quality in that area. This method evaluates the energy of 1000 images which were extracted from 200 iris videos of the MBGC NIR video database. The results show that the proposed method is very effective at assessing the quality of the iris information. It obtains the two highest-energy images as the best two images from each video in 226 milliseconds.
By
Mata, Felix; Claramunt, Christophe
4 Citations
Geographical data commonly available on the Web is often related to thematic and temporal data, and represented in heterogeneous data sources. This makes geographical information retrieval tasks non-straightforward processes, as the information available is often unstructured. The research presented in this paper introduces an ontology-driven approach whose objective is to facilitate retrieval of geographical information on the Web. Spatio-temporal queries are categorized in order to solve questions related to several complementary modeling dimensions: when, what and where. The retrieval strategy is based on the exploration of a spatio-temporal ontology, and implemented by a search engine. The whole approach is exemplified by a prototype development, and the results obtained are evaluated by metrics based on recall and precision measures. The experimental findings show that the strategy is effective for the types of spatio-temporal queries used by our geographical retrieval approach.
By
Tellez, Eric Sadit; Chavez, Edgar; Graff, Mario
1 Citation
Efficiently searching for patterns in very large collections of objects is a very active area of research. Over the last few years a number of indexes have been proposed to speed up the searching procedure. In this paper, we introduce a novel framework (the K-nearest references) in which several approximate proximity indexes can be analyzed and understood. The search spaces where the analyzed indexes work span from vector spaces and general metric spaces up to general similarity spaces.
The proposed framework clarifies the principles behind the searching complexity and allows us to propose a number of novel indexes with a high recall rate, low search time, and a linear storage requirement as salient characteristics.
By
Cruz-Techica, Sonia; Escalante-Ramirez, Boris
1 Citation
The Hermite transform is introduced as an image representation model for multiresolution image fusion with noise reduction. Image fusion is achieved by combining the steered Hermite coefficients of the source images; the coefficients are combined with a decision rule based on linear algebra through a measurement of linear dependence. The proposed algorithm has been tested on both multifocus and multimodal image sets, producing results that exceed those achieved with other methods such as wavelets, curvelets [11], and contourlets [2], proving that our scheme best characterizes important structures of the images while reducing noise.
By
Dwivedi, Vishal; Velasco-Elizondo, Perla; Maria Fernandes, Jose; Garlan, David; Schmerl, Bradley
2 Citations
Computations are pervasive across many domains, where end users have to compose various heterogeneous computational entities to perform professional activities. Service-Oriented Architecture (SOA) is a widely used mechanism that can support such forms of composition, as it allows heterogeneous systems to be wrapped as services that can then be combined with each other. However, current SOA orchestration languages require writing scripts that are typically too low-level for end users to write, being targeted at professional programmers and business analysts. To address this problem, this paper proposes a composition approach based on an end user specification style called SCORE. SCORE is an architectural style that uses high-level constructs that can be tailored for different domains and automatically translated into executable constructs by tool support. We demonstrate the use of SCORE in two domains, dynamic network analysis and neuroscience, where the users are intelligence analysts and neuroscientists, respectively, who use the architectural-style-based vocabulary in SCORE as a basis for their domain-specific compositions that can be formally analyzed.
By
Moya-Sánchez, E. Ulises; Vázquez-Santacruz, Eduardo
A new bio-inspired model is proposed in this paper. This model mimics the simple cells of the mammalian visual processing system in order to recognize low-level geometric structures such as oriented lines, edges, and others constructed from these. It takes advantage of geometric algebra in order to represent structures and symmetric operators, by estimating the relation between geometric entities and encoding it. This geometric model uses symmetric relations in which there exists an invariance under some transformation, in accordance with biological models. It is based on a Quaternionic Atomic Function and its phase information to detect oriented lines, edges and geometric structures defined by lines. Also, it uses a geometric neural network to encode the transformation between lines and then to classify geometric structures.
By
Garza Castañón, Luis E.; Guillén, Deneb Robles; Morales-Menendez, Ruben
A fault diagnosis framework for electrical power transmission networks, which combines Hybrid Bayesian Networks (HBN) and wavelets, is proposed. An HBN is a probabilistic graphical model in which discrete and continuous data are analyzed. In this work, the power network’s protection breakers are modeled as discrete nodes, and information extracted from the voltages measured in every electrical network node represents the continuous nodes. Protection breakers are devices whose function is to isolate faulty nodes by opening the circuit, and they are considered to be working in one of three states: OK, OPEN, and FAIL. On the other hand, node voltage data are processed with wavelets, delivering specific coefficient patterns which are encoded into probability distributions of continuous HBN nodes. Experimental tests show a good performance of the diagnostic system when simultaneous multiple faults are simulated in a 24-node electrical network, in comparison with a previous approach in the same domain.
By
Martí-Farré, Jaume; Padró, Carles; Vázquez, Leonor
3 Citations
The complexity of a secret sharing scheme is defined as the ratio between the maximum length of the shares and the length of the secret. This paper deals with the open problem of optimizing this parameter for secret sharing schemes with general access structures. Specifically, our objective is to determine the optimal complexity of the access structures with exactly four minimal qualified subsets. Lower bounds on the optimal complexity are obtained by using the known polymatroid technique in combination with linear programming. Upper bounds are derived from decomposition constructions of linear secret sharing schemes. In this way, the exact value of the optimal complexity is determined for several access structures in that family. For the other ones, we present the best known lower and upper bounds.
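As a concrete anchor for the complexity parameter defined above: in Shamir's threshold scheme every share is a single field element, the same length as the secret, so the ratio of maximum share length to secret length is exactly 1 (the ideal case; the paper studies structures where the optimum is higher). A compact sketch over the Mersenne prime 2^127 − 1 (the prime, threshold and share count are illustrative choices):

```python
import random

P = 2 ** 127 - 1   # a Mersenne prime; secret and every share live in GF(P)

def make_shares(secret, k, n, seed=11):
    """Split `secret` into n shares, any k of which reconstruct it."""
    rng = random.Random(seed)
    coeffs = [secret] + [rng.randrange(P) for _ in range(k - 1)]
    def poly(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = make_shares(123456789, k=3, n=5)
recovered = reconstruct(shares[1:4])
```

Each share `(x, y)` carries one field element `y`, matching the secret's length, which is what "complexity 1" means; general access structures such as those with four minimal qualified subsets can force longer shares.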
By
Barba, Luis; Korman, Matias; Langerman, Stefan; Silveira, Rodrigo I.
1 Citation
We present several algorithms for computing the visibility polygon of a simple polygon $\mathcal{P}$ from a viewpoint inside the polygon, when the polygon resides in read-only memory and only a few working variables can be used. The first algorithm uses a constant number of variables, and outputs the vertices of the visibility polygon in $O(n\bar{r})$ time, where $\bar{r}$ denotes the number of reflex vertices of $\mathcal{P}$ that are part of the output. The next two algorithms use $O(\log r)$ variables, and output the visibility polygon in $O(n\log r)$ randomized expected time or $O(n\log^2 r)$ deterministic time, where $r$ is the number of reflex vertices of $\mathcal{P}$.
By
Rosales-Pérez, Alejandro; Reyes-García, Carlos A.; Gómez-Gil, Pilar
6 Citations
In this paper we describe a genetic fuzzy relational neural network (FRNN) designed for classification tasks. The genetic part of the proposed system determines the best configuration for the fuzzy relational neural network. Besides optimizing the parameters of the FRNN, the fuzzy membership functions are adjusted to fit the problem. The system is tested on several infant cry databases, reaching results of up to 97.55%. The design and implementation process, as well as some experiments along with their results, are shown.
By
Camiña, Benito; Monroy, Raúl; Trejo, Luis A.; Sánchez, Erika
3 Citations
Given that information is an extremely valuable asset, it is vital to timely detect whether one’s computer (session) is being illegally seized by a masquerader. Masquerade detection has been actively studied for more than a decade, especially after the seminal work of Schonlau’s group, who suggested that, to profile a user, one should model the history of the commands she would enter into a UNIX session. Schonlau’s group have yielded a masquerade dataset, which has been the standard for comparing masquerade detection methods. However, the performance of these methods is not conclusive, and, as a result, research on masquerade detection has resorted to other sources of information for profiling user behaviour. In this paper, we show how to build an accurate user profile by looking into how the user structures her own file system and how she navigates such structure. While preliminary, our results are encouraging and suggest a number of ways in which new methods can be constructed.
By
Renteria-Agualimpia, Walter; Levashkin, Sergei
1 Citation
The development of the geospatial semantic web requires search mechanisms that go beyond syntactic comparison and perform semantic analysis in order to retrieve information conceptually similar to what the user searched for. This would reduce the risk of returning empty results (no match found) when there is no exact correspondence between the query and the information available in data repositories. In this work, we describe an information retrieval model that integrates a semantic criterion with geospatial criteria in a non-homogeneous vector space. The criteria represent the dimensions of this space; these dimensions are weighted as a function of the user’s preferences or profile. The integration is based on a mathematical expression to evaluate the relevance of each result. We present a system that implements the proposed geospatial semantic approach to retrieve information from the domain of cultural tourism, specifically museums. The results show the advantages of integrating geospatial and semantic criteria, taking into account user profiles to offer a more customized (personalized) service.
By
Barrientos, Mario; Madrid, Humberto
This work introduces a new technique for edge detection based on a graph theory tool known as the normalized cut. The problem involves finding a certain eigenvector of a matrix called the normalized Laplacian, which is constructed in such a way that it represents the relation of color and distance between the image’s pixels. The matrix dimensions and the fact that it is dense pose a problem for common eigensolvers. The power method seemed a good option to tackle this problem. The first results were not very impressive, but a modification of the function that relates the image pixels led us to a more convenient Laplacian structure and to a segmentation result known as edge detection. A deeper analysis showed that this procedure does not even need the power method, because the eigenvector that defines the segmentation can be obtained in closed form.
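The pipeline described above can be illustrated on a toy 1-D "image" (this is a generic normalized-cut sketch, not the authors' modified affinity function): build an intensity affinity matrix, form the normalized Laplacian, run power iteration on a shifted matrix with the trivial eigenvector deflated, and read the edge off the sign change of the resulting eigenvector. The intensity values and sigma are invented for illustration.

```python
import math

# Toy 1-D "image": two flat regions with a step edge between pixels 3 and 4.
PIX = [10, 10, 10, 10, 200, 200, 200, 200]
SIGMA = 30.0
n = len(PIX)

def affinity(a, b):
    # Pixels with similar intensity are strongly connected.
    return math.exp(-((a - b) / SIGMA) ** 2)

W = [[0.0 if i == j else affinity(PIX[i], PIX[j]) for j in range(n)]
     for i in range(n)]
d = [sum(row) for row in W]

# Normalized Laplacian: L = I - D^{-1/2} W D^{-1/2}.
L = [[(1.0 if i == j else 0.0) - W[i][j] / math.sqrt(d[i] * d[j])
      for j in range(n)] for i in range(n)]
M = [[(2.0 if i == j else 0.0) - L[i][j] for j in range(n)] for i in range(n)]

def matvec(A, v):
    return [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]

def normalize(v):
    s = math.sqrt(sum(x * x for x in v))
    return [x / s for x in v]

# Power iteration on 2I - L converges to the smallest eigenvectors of L;
# the trivial eigenvector D^{1/2} * 1 is projected out to reach the next one.
v0 = normalize([math.sqrt(di) for di in d])
v = normalize([i - (n - 1) / 2.0 for i in range(n)])
for _ in range(500):
    dot = sum(a * b for a, b in zip(v, v0))
    v = normalize(matvec(M, [a - dot * b for a, b in zip(v, v0)]))

# A sign change in the eigenvector marks the detected edge.
edges = [i for i in range(n - 1) if v[i] * v[i + 1] < 0]
```

The sign pattern of the second eigenvector partitions the pixels into the two flat regions, so the single sign change lands exactly on the step.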
By
Jiménez Vargas, Sergio; Gelbukh, Alexander
1 Citation
Soft cardinality (SC) is a softened version of the classical cardinality of set theory. However, given its prohibitive computational cost (exponential order), an approximation that is quadratic in the number of terms in the text has been proposed in the past. SC Spectra is a new linear-time approximation method for text strings, which divides text strings into consecutive substrings (i.e., q-grams) of different sizes. Thus, SC in combination with resemblance coefficients allowed the construction of a family of similarity functions for text comparison. These similarity measures have been used in the past to address an entity resolution (name matching) problem, outperforming the SoftTFIDF measure. The SC Spectra method improves the previous results using less time and obtaining better performance. This allows the new method to be used with relatively large documents such as those included in classic information retrieval collections. The SC Spectra method exceeded the SoftTFIDF and cosine tf-idf baselines with an approach that requires no term weighting.
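The multi-size q-gram idea can be sketched in a few lines. The sizes (2, 3) and the Jaccard-style resemblance coefficient below are illustrative choices, not necessarily the paper's exact configuration; the point is that each spectrum is computed in time linear in the string length.

```python
def qgrams(text, q):
    """All contiguous substrings of length q (linear in len(text))."""
    return {text[i:i + q] for i in range(len(text) - q + 1)}

def sc_spectrum(text, sizes=(2, 3)):
    # Tag each q-gram with its size so spectra of different q don't collide.
    spectrum = set()
    for q in sizes:
        spectrum |= {(q, g) for g in qgrams(text, q)}
    return spectrum

def similarity(a, b):
    """Jaccard-style resemblance coefficient over the two spectra."""
    sa, sb = sc_spectrum(a), sc_spectrum(b)
    return len(sa & sb) / len(sa | sb)
```

On a name-matching style example, near-duplicate strings score well above unrelated ones without any term weighting, which is the behaviour the abstract describes.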
By
Konigsberg, Zvi Retchkiman
The main objective and contribution of this paper is to use a formal, mathematical approach to prove that parallel computer processing systems are better than serial computer processing systems with respect to saving time and/or money and being able to solve larger problems. This is achieved thanks to the theory of Lyapunov stability and max-plus algebra applied to discrete event systems modeled with time Petri nets.
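Max-plus algebra replaces ordinary addition with max and ordinary multiplication with addition, which lets firing times of a timed event graph be propagated by a matrix product. A minimal sketch (the delay matrix and initial times are invented for illustration, not taken from the paper):

```python
NEG_INF = float("-inf")   # the max-plus "zero" element (identity for max)

def maxplus_matmul(A, B):
    """Max-plus product: (A (x) B)[i][j] = max_k (A[i][k] + B[k][j])."""
    n, m, p = len(A), len(B), len(B[0])
    return [[max(A[i][k] + B[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

# Toy timed event graph: A[i][j] is the delay from machine j to machine i.
A = [[3, 7],
     [2, 4]]
x0 = [[1], [0]]                    # initial firing times
x1 = maxplus_matmul(A, x0)         # next firing times: x1 = A (x) x0
```

Iterating `x(k+1) = A (x) x(k)` yields the event timing whose asymptotic growth rate (the max-plus eigenvalue) is what stability arguments of the kind the paper uses reason about.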
By
Vilariño, Darnes; Pinto, David; Balderas, Carlos; Tovar, Mireya; Beltrán, Beatriz; Paniagua, Sofia
Detection of discriminant terms allows us to improve the performance of natural language processing systems. The goal is to be able to find the possible contribution of each term in a given corpus and, thereafter, to use the terms of high contribution for representing the corpus. In this paper we present various experiments that use elliptic curves with the purpose of discovering discriminant terms of a given textual corpus. Different experiments led us to use the mean and variance of the corpus terms for determining the parameters of a reduced Weierstrass equation (elliptic curve). We use the elliptic curves in order to graphically visualize the behavior of the corpus vocabulary. Thereafter, we use the elliptic curve parameters in order to cluster those terms that share characteristics. These clusters are then used as discriminant terms in order to represent the original document collection. Finally, we evaluated all these corpus representations in order to determine those terms that best discriminate each document.
By
Sheremetov, Leonid; Cosultchi, Ana; Batyrshin, Ildar; Velasco-Hernandez, Jorge
Several pattern recognition techniques are applied to the hydrogeological modeling of mature oilfields. Principal component analysis and clustering have become an integral part of microarray data analysis and interpretation. The algorithmic basis of clustering, the application of unsupervised machine-learning techniques to identify the patterns inherent in a data set, is well established. This paper discusses the motivations for and applications of these techniques to integrate water production data with other physicochemical information in order to classify the aquifers of an oilfield. Further, two time series pattern recognition techniques for basic water cut signatures are discussed and integrated within the methodology for water breakthrough mechanism identification.
By
Ayala, Francisco J.; Herrera, Abel
This article describes the process of extracting a set of cepstral coefficients from a warped frequency space (mel and bark) and analyzing the perceived differences in the reconstructed signal. We try to determine whether there is any audible improvement between these two most widely used scales for the purpose of speech analysis by synthesis. We use the same procedure for parameter extraction and signal reconstruction for both functions, replacing only the warping scale. The proposed system is based on a basic cepstral analysis-synthesis model on the mel scale, whose excitation signal generation process has been changed. The inverse MLSA filter was obtained in order to generate the analysis signal; this signal is then fed into a wavelet decomposition block and the resulting coefficients are sent to the decoding system, where the excitation signal is reconstructed. Furthermore, the mel scale is replaced by the bark scale.
By
Figueroa Mora, Karina; Paredes, Rodrigo; Rangel, Roberto
Modeling proximity searching problems in a metric space allows one to approach many problems in different areas, e.g. pattern recognition, multimedia search, or clustering. Recently, the permutation-based approach was proposed, a novel technique that is unbeatable in practice but difficult to compress. In this article we introduce an improvement on that metric space search data structure. Our technique shows that we can compress the permutation-based algorithm without losing precision. We show experimentally that our technique is competitive with the original idea and improves on it by up to 46% in real databases.
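For readers unfamiliar with the underlying technique: the permutation-based approach represents every object by the order in which it "sees" a fixed set of reference points, and ranks candidates by how similar their permutations are to the query's. A minimal sketch of the uncompressed idea (the 1-D data, references and Spearman footrule comparator are illustrative; this is not the paper's compressed variant):

```python
# Hypothetical 1-D database; the distance is the absolute difference.
REFS = [0.0, 25.0, 50.0, 75.0, 100.0]

def permutation(obj):
    """Order of the reference points as seen from obj (closest first)."""
    return sorted(range(len(REFS)), key=lambda i: abs(obj - REFS[i]))

def spearman_footrule(p, q):
    """Compare two permutations: total displacement of each reference."""
    pos_q = {r: i for i, r in enumerate(q)}
    return sum(abs(i - pos_q[r]) for i, r in enumerate(p))

def search(query, database, k=2):
    pq = permutation(query)
    ranked = sorted(database,
                    key=lambda o: spearman_footrule(pq, permutation(o)))
    return ranked[:k]   # candidates; a real index refines with true distances

DB = [3.0, 30.0, 55.0, 80.0, 99.0]
```

Objects close in the original space tend to induce nearly identical permutations, which is why permutation similarity is such an effective (and compressible) proxy for true distance.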
By
Lizarraga-Morales, Rocio A.; Sanchez-Yanez, Raul E.; Ayala-Ramirez, Victor
1 Citation
Texel size determination on periodic and near-periodic textures is a problem that has been addressed for years, and it currently remains an important issue in structural texture analysis. This paper proposes an approach to determine the texel size based on the computation and analysis of texture homogeneity properties. We analyze the homogeneity feature computed from difference histograms while varying the displacement vector for a preferred orientation. As we vary this vector, we expect a maximum value in the homogeneity data when its magnitude matches the texel size in a given orientation. We show that this approach can be used for both periodic and near-periodic textures, that it is robust to noise and blur perturbations, and that it has advantages over other approaches in computation time and memory storage.
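The homogeneity-peak idea can be demonstrated on a toy 1-D texture row (the texture values and the exact homogeneity formula below are illustrative assumptions; the paper works with 2-D difference histograms): when the displacement equals the texel size, pixel differences collapse to zero and homogeneity peaks.

```python
from collections import Counter

# Synthetic periodic texture row with texel size 4 (values illustrative).
ROW = [0, 3, 1, 2] * 16

def homogeneity(row, d):
    """Homogeneity of the difference histogram for displacement d:
    sum over difference values v of p(v) / (1 + v^2)."""
    diffs = [abs(row[i] - row[i + d]) for i in range(len(row) - d)]
    hist = Counter(diffs)
    total = len(diffs)
    return sum((c / total) / (1 + v * v) for v, c in hist.items())

scores = {d: homogeneity(ROW, d) for d in range(1, 8)}
texel = max(scores, key=scores.get)   # displacement with peak homogeneity
```

At d = 4 every difference is zero, so the histogram has all its mass at v = 0 and the homogeneity reaches its maximum of 1.0; mismatched displacements spread the histogram and score lower.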
By
Nguyen, Ngoc-Hieu; Son, Tran Cao; Pontelli, Enrico; Sakama, Chiaki
1 Citation
This paper describes a platform to develop negotiating agents whose knowledge and rules of behavior are represented as Abductive Logic Programs. The platform implements a flexible negotiation framework. Negotiating agents can operate with multiple goals and incomplete knowledge, and dynamically modify their goals depending on the progress of the negotiation exchanges. Unlike in other frameworks, agents can operate dishonestly, generating false statements or statements that are not substantiated by the agents’ knowledge. The proposed platform has been implemented using the ASP-Prolog platform.
By
Rosas-Romero, Roberto; Starostenko, Oleg; Rodríguez-Asomoza, Jorge; Alarcon-Aquino, Vicente
This paper presents a novel approach for the registration of 3D images based on an optimal free-form rigid transformation. The proposal consists of semi-automatic image segmentation that reconstructs 3D object surfaces in medical images. The proposed extraction technique employs gradients in sequences of 3D medical images to attract a deformable surface model, using imaging planes that correspond to multiple locations of feature points in space instead of detecting contours on each imaging plane in isolation. Feature points are used as a reference before and after a deformation. The relation between them is difficult to establish and deserves attention, in order to develop a methodology for finding the optimal number of points that gives the best estimates without sacrificing computational speed. After generating a representation for each of two 3D objects, we find the best similarity transformation that represents the object deformation between them. The proposed approach has been tested using different imaging modalities by morphing data from histology sections to match MRI of the carotid artery.
By
Garcia, I.; Pacheco, C.; Cruz, D.
Under a Software Process Improvement (SPI) environment, all phases of a process improvement initiative (establishing commitment, assessment or diagnosis, improvement plan generation, pilot implementation and improvement deployment) may be accomplished collaboratively by different groups inside an enterprise. Organizational, technical and process-based circumstances have an impact on process assessment and modeling practices. Based on Model-based Collaborative Design, a strategy for collaborative process assessment and modeling is proposed. This collaborative support helps project managers overcome complexities and create a common understanding of the process and products of an SPI initiative. Finally, a Rich Internet Application (RIA), the CEForSPI (Collaborative Environment For Software Process Improvement) prototype, is developed and applied to provide strong support for distributed project managers to collaboratively assess and model their software process within an SPI project. This tool represents a collaborative strategy to support SPI teams in handling the different phases of a typical SPI lifecycle.
By
Ortiz-Bayliss, José Carlos; Terashima-Marín, Hugo; Özcan, Ender; Parkes, Andrew J.; Conant-Pablos, Santiago Enrique
Constraint Satisfaction Problems (CSP) represent an important topic of study because of their many applications in different areas of artificial intelligence and operational research. When solving a CSP, the order in which the variables are selected to be instantiated and the order in which the corresponding values are tried affect the complexity of the search. Hyper-heuristics are flexible methods that provide generality when solving different problems and, within CSP, they can be used to determine the next variable and value to try. They select from a set of low-level heuristics and decide which one to apply at each decision point according to the problem state. This study explores a hyper-heuristic model for variable and value ordering within CSP based on a decision-matrix hyper-heuristic constructed by a local improvement method that changes small portions of the matrix. The results suggest that the approach is able to combine the strengths of different low-level heuristics to perform well on a wide range of instances and compensate for their weaknesses on specific instances.
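The decision-matrix idea can be sketched as follows on a tiny graph-coloring CSP. The state feature (how far the search has progressed) and the two low-level heuristics (max-degree early, minimum-remaining-values later) are illustrative assumptions, not the authors' learned matrix.

```python
def consistent(assignment, var, value, neighbors):
    # A value is consistent if no assigned neighbor already uses it
    return all(assignment.get(n) != value for n in neighbors[var])

def mrv(unassigned, assignment, domains, neighbors):
    # Minimum remaining values: pick the variable with fewest consistent values
    def remaining(v):
        return sum(1 for c in domains[v] if consistent(assignment, v, c, neighbors))
    return min(unassigned, key=remaining)

def max_degree(unassigned, assignment, domains, neighbors):
    # Pick the variable that constrains the most other variables
    return max(unassigned, key=lambda v: len(neighbors[v]))

# Decision matrix: coarse problem state -> low-level heuristic to apply
DECISION_MATRIX = {"early": max_degree, "late": mrv}

def solve(variables, domains, neighbors, assignment=None):
    """Backtracking search with hyper-heuristic variable selection."""
    assignment = assignment or {}
    if len(assignment) == len(variables):
        return assignment
    state = "early" if len(assignment) < len(variables) / 2 else "late"
    var = DECISION_MATRIX[state](
        [v for v in variables if v not in assignment],
        assignment, domains, neighbors)
    for value in domains[var]:
        if consistent(assignment, var, value, neighbors):
            assignment[var] = value
            result = solve(variables, domains, neighbors, assignment)
            if result:
                return result
            del assignment[var]
    return None
```

In the paper the matrix entries are themselves improved by local search; here the matrix is fixed by hand to keep the sketch short.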
By
Rodríguez-González, Ansel Yoan; Martínez-Trinidad, José Francisco; Carrasco-Ochoa, Jesús Ariel; Ruiz-Shulcloper, José
9 Citations
Most current algorithms for mining frequent patterns assume that two object sub-descriptions are similar if they are equal, but in many real-world problems other ways of evaluating similarity are used. Recently, three algorithms (ObjectMiner, STreeDCMiner and STreeNDCMiner) for mining frequent patterns allowing similarity functions different from equality have been proposed. For searching frequent patterns, ObjectMiner and STreeDCMiner use a pruning property called the Downward Closure property, which must be satisfied by the similarity function. For similarity functions that do not meet this property, the STreeNDCMiner algorithm was proposed. However, for searching frequent patterns, this algorithm explores all subsets of features, which can be very expensive. In this work, we propose a frequent similar pattern mining algorithm for similarity functions that do not meet the Downward Closure property; it is faster than STreeNDCMiner and loses fewer frequent similar patterns than ObjectMiner and STreeDCMiner. We also show the quality of the set of frequent similar patterns computed by our algorithm with respect to the quality of those computed by the other algorithms, in a supervised classification context.
By
Juárez-Ramírez, Reyes; Licea, Guillermo; Barriba, Itzel; Izquierdo, Víctor; Ángeles, Alfonso
1 Citations
Mobile applications have proliferated greatly in recent years. Their usage ranges from personal applications to enterprise systems. Despite this proliferation, the development of mobile applications confronts some limitations and faces particular challenges. Usability is one of the main quality domains to address in such systems. To improve this quality attribute we suggest incorporating into the software development process best practices from other disciplines, such as usability engineering and human-computer interaction. It is also important to incorporate studies that help identify the requirements of each mobile device capability in order to offer services in a usable manner. In this paper we present a proposal to apply user-oriented analysis and design, emphasizing specific practices such as user and task analysis. We also present a case study consisting of a mobile application for the iPhone, which allows us to validate our proposal.
By
Tudón-Martínez, Juan C.; Morales-Menendez, Ruben; Garza-Castañón, Luis; Ramirez-Mendoza, Ricardo
1 Citations
Dynamic Principal Component Analysis (DPCA) and Artificial Neural Networks (ANN) are compared in the fault diagnosis task. Both approaches are process-history-based methods, which do not assume any form of model structure and rely only on process historical data. Faults in sensors and actuators are implemented to compare the online performance of both approaches in terms of quick detection, isolability capacity and identifiability of multiple faults. An industrial heat exchanger was the experimental test bed. Multiple faults in sensors can be isolated using an individual control chart generated from the principal components; the classification error was 15.28%, whereas ANN presented 4.34%. For faults in actuators, ANN showed instantaneous detection and a 14.7% lower classification error. However, DPCA required less computational effort in the training step.
By
Naredo, Enrique; Castillo, Oscar
5 Citations
We describe the use of Ant Colony Optimization (ACO) for the ball and beam control problem, in particular for tuning a fuzzy controller of the Sugeno type. In our case study the controller has four inputs, each of them with two membership functions; we consider the intersection point of every pair of membership functions as the main parameter and their individual shapes as secondary ones in order to tune the fuzzy controller with an ACO algorithm. Simulation results show that using ACO and coding the problem with just three parameters instead of six allows us to find an optimal set of membership function parameters for the fuzzy control system with less computational effort.
By
Aguilar, José Alfonso; Garrigós, Irene; Mazón, Jose-Norberto
Due to the continuous changes and heterogeneous audience of the Web, a requirements engineering stage is crucial for Web development. Importantly, this stage should consider that Web applications are likely to evolve rapidly during the development process, leading to inconsistencies among requirements. Therefore, Web developers need to know the dependencies among requirements to ensure that Web applications ultimately satisfy the audience. Understanding requirement dependencies also helps in better managing and maintaining Web applications. In this work, an algorithm is defined to deal with dependencies among functional and non-functional requirements in order to understand the impact of making changes when developing a Web application.
By
Garro, Beatriz A.; Sossa, Humberto; Vázquez, Roberto A.
6 Citations
Due to their efficiency and adaptability, bio-inspired algorithms have shown their usefulness in a wide range of nonlinear optimization problems. In this paper, we compare two ways of training an artificial neural network (ANN): the Particle Swarm Optimization (PSO) and Differential Evolution (DE) algorithms. The main contribution of this paper is to show which of these two algorithms provides the best accuracy during the learning phase of an ANN. First, we explain how the ANN training phase can be seen as an optimization problem. Then, we explain how PSO and DE can be applied to find the best synaptic weights of the ANN. Finally, we compare the PSO and DE approaches when used to train an ANN on different nonlinear problems.
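As a concrete illustration of the DE side, here is a hedged sketch that evolves the 9 weights of a tiny 2-2-1 tanh network on XOR. The network size, population size, and the F/CR settings are ordinary textbook choices, not the paper's experimental setup.

```python
import math
import random

def net_forward(w, x):
    """Tiny 2-2-1 tanh network; w packs 9 weights (6 hidden + 3 output)."""
    h1 = math.tanh(w[0] * x[0] + w[1] * x[1] + w[2])
    h2 = math.tanh(w[3] * x[0] + w[4] * x[1] + w[5])
    return math.tanh(w[6] * h1 + w[7] * h2 + w[8])

def mse(w, data):
    return sum((net_forward(w, x) - y) ** 2 for x, y in data) / len(data)

def de_train(data, dim=9, pop_size=20, gens=200, f=0.7, cr=0.9, seed=1):
    """DE/rand/1/bin over the flattened weight vector; the network loss
    plays the role of the fitness function."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(pop_size)]
    fit = [mse(w, data) for w in pop]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)
            trial = [pop[a][j] + f * (pop[b][j] - pop[c][j])
                     if (rng.random() < cr or j == j_rand) else pop[i][j]
                     for j in range(dim)]
            t_fit = mse(trial, data)
            if t_fit <= fit[i]:  # greedy selection keeps the better vector
                pop[i], fit[i] = trial, t_fit
    best = min(range(pop_size), key=lambda i: fit[i])
    return pop[best], fit[best]
```

The greedy one-to-one selection makes the best fitness non-increasing, which is the property that lets DE act as a derivative-free trainer for the synaptic weights.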
By
Ledeneva, Yulia; Hernández, René García; Soto, Romyna Montiel; Reyes, Rafael Cruz; Gelbukh, Alexander
3 Citations
Automatic text summarization has emerged as a technique for accessing only useful information. In order to assess the quality of the automatic summaries produced by a system, DUC 2002 (Document Understanding Conference) developed a standard collection of human summaries, called the gold collection, for 567 single-news documents. In that conference only five systems could outperform the baseline heuristic in the single-document extractive summarization task. So far, some approaches have obtained good results by combining different strategies with language-dependent knowledge. In this paper, we present a competitive method based on an EM clustering algorithm for improving the quality of automatic summaries using practically no language-dependent knowledge. A comparison of this method with three text models is also presented.
By
Dalmau, Oscar; Alarcon, Teresa
3 Citations
Blood vessel extraction is an important step for abnormality detection and for obtaining a good diabetic retinopathy diagnosis in digital retinal images. The use of filter banks has been shown to be a powerful technique for detecting blood vessels. In particular, the Matched Filter is appropriate and efficient for this task, and in combination with other methods blood vessel detection can be improved. We propose a combination of the Matched Filter with a segmentation strategy based on a Cellular Automaton. The strategy presented here is very efficient and experimentally yields competitive results compared with other state-of-the-art methods.
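A 1-D sketch of the Matched Filter step, assuming the classic zero-mean inverted-Gaussian vessel profile (the 2-D oriented kernel bank and the Cellular Automaton stage are omitted): the response peaks where the kernel aligns with a vessel-like intensity dip.

```python
import math

def matched_filter_kernel(sigma=2.0, half_width=6):
    """Zero-mean inverted Gaussian: the cross-section of a dark vessel."""
    g = [math.exp(-x * x / (2 * sigma * sigma))
         for x in range(-half_width, half_width + 1)]
    m = sum(g) / len(g)
    return [m - v for v in g]  # zero mean, negative at the center

def filter_response(signal, kernel):
    """Cross-correlate the kernel with the signal (edges left at zero)."""
    half = len(kernel) // 2
    out = [0.0] * len(signal)
    for t in range(half, len(signal) - half):
        out[t] = sum(kernel[j] * signal[t + j - half]
                     for j in range(len(kernel)))
    return out
```

Because the kernel has zero mean, flat background produces zero response, so only vessel-shaped dips stand out; this is what makes the subsequent segmentation stage easier.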
By
Argotte, Liliana; Hernandez, Yasmin; Arroyo-Figueroa, G.
1 Citations
Training of operators has become an important problem faced by power systems: updating knowledge and skills. An operator must comprehend the physical operation of the process and must be skilled in handling a number of normal and abnormal operating problems and emergencies. We are developing an intelligent environment for training power system operators. This paper presents the architecture of the intelligent environment, composed of reusable learning objects, concept structure maps, an operator cognitive and affective model, a tutor with adaptive sequencing, and a learning interface. The operator model and adaptive sequence are represented by probabilistic networks that select the best pedagogical and affective action for each specific operator. The model was evaluated in academic environments with good results. The general aim of our work is to provide operators of complex industrial environments with suitable training from a pedagogical and affective viewpoint in order to certify operators' knowledge.
By
Neme, Antonio; Hernández, Sergio; Neme, Omar
We propose the use of self-organizing maps as models of social processes, in particular of electoral preferences. In some voting districts patterns of electoral preferences emerge, such that in nearby areas citizens tend to vote for the same candidate, whereas geographically distant areas favor candidates whose political positions are also distant. These patterns are similar to the spatial structure achieved by self-organizing maps. This model is able to achieve spatial order from disorder by forming a topographic map of the external field, identified with advertising from the media. Here individuals are represented in two spaces: a static geographical location and a dynamic political position. The modification of the latter leads to a pattern in which both spaces are correlated.
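A minimal 1-D self-organizing map over scalar "political positions" shows the mechanism: each input pulls its best-matching unit and that unit's neighbors toward it, which is how spatial order emerges from disorder. The unit count, decay schedules and Gaussian neighborhood are generic SOM choices, not the paper's setup.

```python
import math
import random

def train_som(data, n_units=10, epochs=50, lr0=0.5, radius0=3.0, seed=0):
    """1-D SOM over scalar inputs in [0, 1]."""
    rng = random.Random(seed)
    weights = [rng.random() for _ in range(n_units)]
    steps = epochs * len(data)
    t = 0
    for _ in range(epochs):
        for x in data:
            lr = lr0 * (1 - t / steps)            # learning rate decays
            radius = max(radius0 * (1 - t / steps), 0.5)
            # best-matching unit: closest weight to the input
            bmu = min(range(n_units), key=lambda i: abs(weights[i] - x))
            for i in range(n_units):
                # Gaussian neighborhood: nearby units are pulled along
                h = math.exp(-((i - bmu) ** 2) / (2 * radius ** 2))
                weights[i] += lr * h * (x - weights[i])
            t += 1
    return weights
```

Because each update is a convex step toward the input, the weights never leave the data range; in the paper's setting the analogous quantity is the dynamic political position driven by the media's external field.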
By
Maheswaran, Rajiv T.; Szekely, Pedro; Sanchez, Romeo
2 Citations
We address multi-agent planning problems in dynamic environments, motivated by assisting human teams in disaster emergency response. These problems are challenging because most goals are revealed during execution, uncertainty in the duration and outcome of actions plays a significant role, and unexpected events can cause large disruptions to existing plans. The key to our approach is giving human planners a rich strategy language to constrain the assignment of agents to goals and allowing the system to instantiate the strategy during execution, tuning the assignment to the evolving execution state. Our approach outperformed both an extensively trained team coordinating with radios under a traditional command-center organization and an agent-assisted team using a different approach.
By
Pozos-Parra, Pilar; Perrussel, Laurent; Thevenin, Jean Marc
Belief merging aims to conciliate multiple, possibly inconsistent belief bases into a consistent common belief base. To handle inconsistency, several operators have been proposed. Most of them do not consider inconsistent bases. PS-Merge is an alternative merging method that uses the notion of Partial Satisfiability and allows us to take inconsistent bases into account. PS-Merge needs the bases represented as DNF formulas; nevertheless, many practical problems are more easily represented in CNF. The aim of this paper is to extend the notion of Partial Satisfiability in order to consider bases represented as CNF formulas. Moreover, we consider prime normal forms in order to define a method that allows us to implement PS-Merge for difficult theories. We also show that once the belief bases are represented as sets of normal forms, PS-Merge is polynomial.
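To make the flavor of the approach concrete, here is a toy merge over CNF bases where the partial satisfiability of a base is approximated by the fraction of its clauses an interpretation satisfies. The paper's Partial Satisfiability is a finer-grained notion, so this is only a simplified stand-in; literals are signed integers (-1 stands for the negation of variable 1).

```python
from itertools import product

def ps_degree(cnf, interp):
    """Fraction of clauses satisfied by an interpretation: a coarse
    stand-in for Partial Satisfiability over CNF."""
    sat = sum(1 for clause in cnf
              if any(interp[abs(l)] == (l > 0) for l in clause))
    return sat / len(cnf)

def ps_merge(bases, variables):
    """Pick the interpretations maximizing the summed degree over all
    bases; inconsistent bases still contribute partial scores."""
    best, best_score = [], -1.0
    for values in product([False, True], repeat=len(variables)):
        interp = dict(zip(variables, values))
        score = sum(ps_degree(cnf, interp) for cnf in bases)
        if score > best_score:
            best, best_score = [interp], score
        elif score == best_score:
            best.append(interp)
    return best, best_score
```

Even when two bases disagree on p, every maximizer keeps the uncontested belief q, which is the intended behavior of a merging operator.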
By
Galan-Hernandez, J. C.; Alarcon-Aquino, V.; Starostenko, O.; Ramirez-Cortes, J. M.
1 Citations
Region-of-interest (ROI) based compression can be applied to real-time video transmission in medical or surveillance applications, where certain areas need to retain better quality than the rest of the image. The use of a fovea combined with an ROI for image compression can help to improve the perceived quality and preserve different levels of detail around the ROI. In this paper, a fovea-ROI compression approach is proposed based on the Set Partitioning In Hierarchical Trees (SPIHT) algorithm. Simulation results show that the proposed approach preserves better detail in objects inside the defined ROI than the standard SPIHT algorithm.
By
Ruiz, Elias; Melendez, Augusto; Sucar, L. Enrique
A novel approach to creating a general vision system is presented. The proposed method is based on a visual grammar representation that is transformed into a Bayesian network used for object recognition. We use a symbol-relational grammar for a hierarchical description of objects, incorporating spatial relations. The structure of the Bayesian network is obtained automatically from the grammar, and its parameters are learned from examples. The method is illustrated with two examples of face recognition.
By
Flores-Pulido, Leticia; Starostenko, Oleg; Rodríguez-Gómez, Gustavo; Portilla-Flores, Alberto; Mora-Lumbreras, Marva Angelica; Albores-Velasco, Francisco Javier; Sánchez, Marlon Luna; Cuamatzi, Patrick Hernández
In this paper, an analysis of similarity metrics used for performance evaluation of image retrieval frameworks is provided. Image retrieval based on similarity metrics obtains remarkable results in comparison with robust discrimination methods. The similarity metrics are used in the matching process between the user's visual query and the descriptors of images in a preprocessed collection. In contrast, discrimination methods usually compare feature vectors by computing distances between the visual query and the images in collections. In this research, the behavior of a spline radial basis function used as a metric for image similarity measurement is proposed and evaluated, comparing it with discrimination methods, particularly with the generalized principal component analysis (GPCA) algorithm. The spline radial basis function has been tested in image retrieval using standard image collections such as COIL-100. The obtained results using the spline radial basis function report 88% correct image retrieval while avoiding the classification phase required in other well-known methods. A discussion of tests with the designed Image Data Segmentation with Spline (IDSS) framework illustrates the optimization and improvement of the image retrieval process.
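A sketch of using a thin-plate spline radial basis function, phi(r) = r^2 ln r, to rank collection descriptors against a query. Whether IDSS uses exactly this spline and this ranking rule is not stated in the abstract, so treat both as assumptions; for descriptor distances above exp(-1/2), phi is increasing, so sorting by phi ascending puts the closest matches first.

```python
import math

def tps_rbf(r):
    """Thin-plate spline radial basis function, phi(r) = r^2 * ln(r)."""
    return 0.0 if r == 0 else r * r * math.log(r)

def euclidean(u, v):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def rank_collection(query, collection):
    """Rank descriptor indices by spline-RBF response to the query.

    For distances beyond exp(-1/2) phi is monotonically increasing, so
    ascending phi corresponds to descending similarity.
    """
    return sorted(range(len(collection)),
                  key=lambda i: tps_rbf(euclidean(query, collection[i])))
```

The attraction of this scheme, as the abstract notes, is that ranking by the metric directly avoids the separate classification phase required by discrimination methods.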
By
Escalante, Hugo Jair; Montes-y-Gómez, Manuel; Solorio, Thamar
1 Citations
This paper introduces a new similarity measure called weighted profile intersection (WPI) for profile-based authorship attribution (PBAA). Authorship attribution (AA) is the task of determining which of a set of candidate authors wrote a given document. Under PBAA, an author's profile is created by combining information extracted from sample documents written by the author of interest. An unseen document is associated with the author whose profile is most similar to the document. Although competitive performance has been obtained with PBAA, the method is limited in that the most commonly used similarity measure only accounts for the number of overlapping terms between test documents and authors' profiles. We propose a new measure for PBAA, WPI, which takes into account an inter-author term penalization factor besides the number of overlapping terms. Intuitively, in WPI we rely more on those terms that are (frequently) used by the author of interest and not (frequently) used by other authors when computing the similarity between the author's profile and a test document. We evaluate the proposed method on several AA data sets, including many data subsets from Twitter. Experimental results show that the proposed technique outperforms the standard PBAA method on all of the considered data sets, although the baseline method proved very effective. Further, the proposed method achieves performance comparable to classifier-based AA methods (e.g., methods based on SVMs), which often obtain better classification results at the expense of limited interpretability and a higher computational cost.
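The intuition above can be sketched as follows. The exact WPI formula is not given in the abstract, so the penalty used here (average frequency of the term in the other authors' profiles) is a hedged reading of "inter-author term penalization", not the paper's definition.

```python
from collections import Counter

def build_profile(docs):
    """An author's profile: relative term frequencies over sample docs."""
    counts = Counter(t for d in docs for t in d.split())
    total = sum(counts.values())
    return {t: c / total for t, c in counts.items()}

def wpi(doc, author, profiles):
    """Overlapping terms count more when frequent for this author and
    rare in the other authors' profiles (illustrative weighting)."""
    terms = set(doc.split())
    others = [p for a, p in profiles.items() if a != author]
    score = 0.0
    for t in terms & set(profiles[author]):
        penalty = sum(p.get(t, 0.0) for p in others) / max(len(others), 1)
        score += profiles[author][t] * (1.0 - penalty)
    return score

def attribute(doc, profiles):
    """Assign the document to the author with the highest WPI score."""
    return max(profiles, key=lambda a: wpi(doc, a, profiles))
```

Terms shared by all candidate profiles contribute almost nothing, which is exactly what distinguishes this from a plain overlap count.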
By
Neme, Antonio; Hernández, Leticia
2 Citations
Air pollution in big cities is a major health problem. Pollutants in the air may have severe consequences for humans, creating conditions for several illnesses and affecting tissues and organs; they also affect other animals and crop productivity. For several years now, air quality has been monitored by stations distributed over major cities, and the concentrations of several pollutants are measured. From these data sets, and applying the data visualization capabilities of the self-organizing map, we analyzed the air quality in Mexico City. We were able to detect some hidden patterns in the pollutant concentrations, as well as to study the evolution of air quality from 2003 to 2010.
By
Rodríguez, Luis-Felipe; Ramos, Félix; García, Gregorio
Cognitive architectures are integrative frameworks that include a series of components that interoperate to generate a variety of behaviors in autonomous agents. Commonly, such components attempt to synthesize the operation and architecture of brain functions, such as perception and emotion. To do this, they embody computational models whose development is based on theories explaining cognitive and affective functions, as well as on principles and standards established for the development of software systems. Unfortunately, such theories and software principles are not always available or entirely adequate. In this paper, we analyze and discuss fundamental issues associated with the development of this type of architectural component. We focus on the problems that arise throughout their development cycle and identify some improvements for the tools used in their construction.
By
Ibarra-Manzano, Mario-Alberto; Almanza-Ojeda, Dora-Luz
2 Citations
This article presents an FPGA-based architecture for the calculation of texture attributes using an adaptation of the sum and difference histograms technique. The attributes calculated by this architecture are used in a classification process for object identification during the navigation of an autonomous service robot. Consequently, the real-time execution constraint plays an essential role in the architecture design. The architecture is designed to calculate 30 dense images with 6 different texture attributes for 10 different displacements. By exploiting the reuse of parallel operations on the FPGA and taking the calculation-time requirements into account, it is possible to use the resources in an efficient and optimized way in order to obtain an architecture with the best trade-off between resources and calculation time. Thanks to its high performance, this architecture can be used in applications such as medical diagnosis or target detection.
By
Pastrana Palma, Alberto; Muñoz, Juan Francisco Reyes; Pérez, Luis Rodrigo Valencia; Aguilar, Juan Manuel Peña; Álvarez, Alberto Lamadrid
Computer-Assisted Diagnosis (CAD) is rapidly reaching worldwide acceptance in different fields of medicine. In particular, CAD has found one of its main applications in breast cancer diagnosis, where the detection of microcalcifications in women's breasts is typically associated with the presence of cancer. In this paper, a method for automatic breast contour detection is presented as a preprocessing step for microcalcification detection. Then, a combination of scale-space algorithms is used to locate candidate regions of microcalcifications, and a significant percentage of false positives is finally discriminated via thresholding. Regions detected using this method have been found to describe 91.6% of the microcalcifications in the MIAS database with an average specificity of 97.30%.
By
Argotte, Liliana; Noguez, Julieta; Arroyo, Gustavo
This paper presents the creation of adaptive SCORM sequencing models, taking advantage of the latest developments in the artificial intelligence field to provide the best choice to the student, based on learning objects, using a tutor model in self-directed learning. The tutor uses decision networks, also called influence diagrams, to improve the development of resources and learning materials in a learning content management system and to offer students the best pedagogical decision according to their performance. The intelligent learning system is validated in an online environment. The results of the evaluation process in undergraduate engineering courses are encouraging because they show improvements in the learning of students who used this approach compared to those who did not. The paper also shows the potential application of this learning approach for power system operators.
By
Elías, Arturo; Ochoa-Zezzatti, Alberto; Padilla, Alejandro; Ponce, Julio
2 Citations
Nowadays, plastic card fraud detection is of great importance to financial institutions. This paper presents a proposal for an automated credit card fraud detection system based on outlier analysis. Previous research has established that outlier analysis is one of the best techniques for fraud detection in general. However, once patterns are established to identify anomalies, these patterns are learned by the fraudsters, who then change the way they commit fraud. The approach applies a multi-objective model hybridized with particle swarm optimization to model typical cardholder behavior and to analyze the deviation of transactions, thus finding suspicious transactions in an unsupervised scheme.
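As a baseline illustration of the outlier-analysis idea (the paper's hybrid multi-objective/PSO model is considerably more elaborate), a simple unsupervised z-score rule over a cardholder's transaction amounts:

```python
import math

def z_score_outliers(amounts, threshold=3.0):
    """Flag transactions whose amount deviates strongly from the
    cardholder's typical behavior (unsupervised: no fraud labels used)."""
    n = len(amounts)
    mean = sum(amounts) / n
    var = sum((a - mean) ** 2 for a in amounts) / n
    std = math.sqrt(var) or 1.0  # guard against a zero-variance history
    return [i for i, a in enumerate(amounts)
            if abs(a - mean) / std > threshold]
```

The weakness the abstract points out applies here directly: a fraudster who keeps amounts close to the cardholder's mean evades a fixed rule, which motivates the adaptive multi-objective formulation.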
By
Tovar, Mireya; Cruz, Adrián; Vázquez, Blanca; Pinto, David; Vilariño, Darnes; Montes, Azucena
1 Citations
In this paper we propose two iterative clustering methods for grouping the Wikipedia documents of a given huge collection into clusters. The recursive method iteratively clusters subsets of the complete collection. In each iteration, we select representative items for each group, which are then used in the next stage of clustering.
The presented approaches are scalable algorithms that may be used with huge collections that would otherwise (for instance, using classic clustering methods) be computationally expensive to cluster. The obtained results outperformed the random baseline presented in the INEX 2010 clustering task of the XML Mining track.
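The representative-based iteration can be sketched on scalar "documents": cluster each chunk of the collection, keep its centroids as representatives, then cluster the representatives. The chunking scheme, the use of 1-D k-means and the single recursion level are illustrative simplifications of the recursive method.

```python
def kmeans_1d(points, k, iters=20):
    """Plain 1-D k-means with deterministic seeds spread over the range."""
    pts = sorted(points)
    centroids = [pts[i * (len(pts) - 1) // max(k - 1, 1)] for i in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            groups[min(range(k), key=lambda i: abs(p - centroids[i]))].append(p)
        centroids = [sum(g) / len(g) if g else centroids[i]
                     for i, g in enumerate(groups)]
    return centroids

def iterative_cluster(collection, k, chunk_size=100):
    """Cluster each chunk, keep its centroids as representatives, then
    cluster the representatives (one recursion level for brevity)."""
    reps = []
    for start in range(0, len(collection), chunk_size):
        chunk = collection[start:start + chunk_size]
        reps.extend(kmeans_1d(chunk, min(k, len(chunk))))
    return kmeans_1d(reps, k)
```

Only one chunk ever needs to be in memory at a time, which is what makes the scheme viable for collections too large for classic clustering.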
By
Lukin, Vladimir; Ponomarenko, Nikolay; Kurekin, Andrey; Pogrebnyak, Oleksiy
1 Citations
Several practical tasks important for effective preprocessing of multichannel remote sensing (RS) images are considered, in order to reliably retrieve useful information from them and to make the data available to potential users. First, possible data processing strategies are discussed. It is shown that one problem is the use of more adequate models to describe the noise present in real images. Another problem is the automation of all, or at least several, stages of data processing, such as determination of the noise type and its statistical characteristics, noise filtering, and image compression before applying classification at the final stage. Second, some approaches that are effective and able to perform well enough within automatic or semi-automatic frameworks for multichannel images are described and analyzed. The applicability of the proposed methods is demonstrated on particular examples of real RS data classification.
By
Hernández, Paula; Gómez, Claudia; Cruz, Laura; Ochoa, Alberto; Castillo, Norberto; Rivera, Gilberto
The computational optimization field defines the parameter tuning problem as the correct selection of parameter values in order to stabilize the behavior of an algorithm. This paper addresses parameter tuning under dynamic and large-scale conditions for an algorithm that solves the Semantic Query Routing Problem (SQRP) in peer-to-peer networks. To solve SQRP, the HH_AdaNAS algorithm is proposed: an ant colony algorithm that deals with two processes synchronously. The first process generates an SQRP solution. The second adjusts the Time-To-Live parameter of each ant through a hyper-heuristic. HH_AdaNAS performs adaptive control through the hyper-heuristic, considering SQRP local conditions. The experimental results show that HH_AdaNAS, incorporating parameter tuning techniques with hyper-heuristics, increases its performance by 2.42% compared with the algorithms for solving SQRP found in the literature.
By
Trujillo, Leonardo; Martínez, Yuliana; Melin, Patricia
1 Citations
A fundamental task that must be addressed before classifying a set of data is choosing the proper classification method. In other words, a researcher must infer which classifier will achieve the best performance on the classification problem in order to make a reasoned choice. This task is not trivial, and it is mostly resolved based on personal experience and individual preferences. This paper presents a methodological approach to producing estimators of classifier performance based on descriptive measures of the problem data. The proposal is to use Genetic Programming (GP) to evolve mathematical operators that take as input descriptors of the problem data and output the expected error that a particular classifier might achieve if used to classify the data. Experimental tests show that GP can produce accurate estimators of classifier performance; we evaluate our approach on a large set of 500 two-class problems of multimodal data, using a neural network for classification. The results suggest that the GP approach could provide a tool that helps researchers make a reasoned decision regarding the applicability of a classifier to a particular problem.
By
Dey, Debangana; Solorio, Thamar; Montes y Gómez, Manuel; Escalante, Hugo Jair
1 Citations
This paper proposes the use of the Silhouette Coefficient (SC) as a ranking measure for instance selection in text classification. Our selection criterion is to keep instances with mid-range SC values while removing those with high and low SC values. We evaluated our hypothesis across three well-known datasets and various machine learning algorithms. The results show that our method helps achieve the best trade-off between classification accuracy and training time.
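A compact sketch of the selection rule on scalar features. The silhouette formula s(i) = (b - a) / max(a, b) is standard; the mid-range cut-off values are illustrative, since the abstract does not give the exact thresholds. Every cluster is assumed to have at least two members.

```python
def silhouette(idx, data, labels):
    """Silhouette coefficient s(i) = (b - a) / max(a, b), where a is the
    mean distance to the own cluster and b to the nearest other cluster."""
    own = [j for j, l in enumerate(labels) if l == labels[idx] and j != idx]
    a = sum(abs(data[idx] - data[j]) for j in own) / len(own)
    b = min(
        sum(abs(data[idx] - data[j])
            for j, l in enumerate(labels) if l == other)
        / sum(1 for l in labels if l == other)
        for other in set(labels) - {labels[idx]})
    return (b - a) / max(a, b)

def select_midrange(data, labels, low=0.2, high=0.8):
    """Keep instances with mid-range silhouettes: drop noisy or
    mislabeled instances (low SC) and redundant core ones (high SC)."""
    scores = [silhouette(i, data, labels) for i in range(len(data))]
    return [i for i, s in enumerate(scores) if low <= s <= high]
```

On a tiny example, a point sitting between two clusters gets a near-zero silhouette and is discarded, while deep-core points with silhouettes close to 1 are discarded as redundant.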
