Showing 1 to 100 of 300 matching Articles
By
Pinto, David; Juan, Alfons; Rosso, Paolo
The world wide web is a natural setting for cross-lingual information retrieval. The European Union is a typical example of a multilingual scenario, where users must deal with information published in at least 20 languages. Given queries in some source language and a target corpus in another language, the typical approach consists of translating either the query or the target dataset into the other language. Other approaches use parallel corpora to obtain a statistical dictionary of words across the different languages. In this work, we propose to use a training corpus made up of a set of Query-Relevant Document Pairs (QRDP) in a probabilistic cross-lingual information retrieval approach based on the IBM alignment model 1 for statistical machine translation. Our approach has two main advantages over those that use direct translation and parallel corpora: we do not obtain a translation of the query, but a set of associated words which share their meaning in some way, and therefore the obtained dictionary is, in a broad sense, more semantic than a translation dictionary. Moreover, since the queries are supervised, we work in a more restricted domain than when using a general parallel corpus (it is well known that in this context results are better than those obtained in a general context). In order to assess the quality of our experiments, we compared the results with those obtained by a direct translation of the queries with a query translation system, observing promising results.
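As a rough illustration of the IBM Model 1 training the abstract refers to, here is a minimal EM sketch. All names and the toy query/document pairs are illustrative, not taken from the paper:

```python
from collections import defaultdict

def ibm_model1(pairs, iterations=10):
    """EM estimation of IBM Model 1 word-association probabilities
    t(target | source) from (source_words, target_words) pairs,
    e.g. query/relevant-document pairs (QRDP)."""
    tgt_vocab = {w for _, t in pairs for w in t}
    t = defaultdict(lambda: 1.0 / len(tgt_vocab))   # uniform start
    for _ in range(iterations):
        count = defaultdict(float)   # expected co-occurrence counts
        total = defaultdict(float)
        for src, tgt in pairs:
            for tw in tgt:
                norm = sum(t[(sw, tw)] for sw in src)
                for sw in src:
                    delta = t[(sw, tw)] / norm
                    count[(sw, tw)] += delta
                    total[sw] += delta
        for (sw, tw), c in count.items():
            t[(sw, tw)] = c / total[sw]
    return t

# Toy Spanish-English pairs: "casa" should come to associate with "house".
pairs = [("la casa".split(), "the house".split()),
         ("la llave".split(), "the key".split())]
t = ibm_model1(pairs)
```

The resulting table t gives, for each source word, a distribution of associated target words rather than a single translation, which is the "more semantic" dictionary the abstract describes.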
By
Winter, Victor; Kapur, Deepak
Based on our investigations of a case study of controllers for train systems [6,7,13,14], we present a model of reactive systems which emphasizes the dynamic partitioning of system behavior into normal and abnormal. The class of reactive systems considered is non-strict in the sense that their behavior is not entirely governed by past events; instead, future events must also be considered in the design of controllers for such systems.
By
Pinto, David; Rosso, Paolo; Jiménez, Ernesto
This paper presents a cross-lingual information retrieval approach which uses a ranking method based on a penalised version of the Jaccard formula. The results obtained after submitting a set of runs to WebCLEF 2006 show that this simple ranking formula may be used in a cross-lingual environment. A comparison with runs submitted by other teams ranks us third when using all the topics. We obtained fourth place, with our best overall results, when using only the new topic set, and second place when using only the automatic topics of the new topic set. An exact comparison with the rest of the participants is in fact difficult to obtain; therefore, we consider that further detailed analysis should be done in order to determine the best components of the proposed system.
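The exact penalisation the paper applies to the Jaccard formula is not given in this abstract; the sketch below shows one plausible form, a multiplicative penalty for length mismatch between query and document term sets, purely as an illustrative assumption:

```python
def penalized_jaccard(query_terms, doc_terms, alpha=0.5):
    """Jaccard similarity between term sets with a length-mismatch
    penalty. NOTE: the penalisation form and the alpha parameter are
    illustrative assumptions, not the paper's exact formula."""
    q, d = set(query_terms), set(doc_terms)
    if not q or not d:
        return 0.0
    jaccard = len(q & d) / len(q | d)
    penalty = min(len(q), len(d)) / max(len(q), len(d))
    return jaccard * penalty ** alpha
```

Documents would then be ranked by this score against the query, highest first.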
By
Olague, Gustavo; Romero, Eva; Trujillo, Leonardo; Bhanu, Bir
2 Citations
This paper presents a linear genetic programming approach that solves the region selection and feature extraction tasks simultaneously, applicable to common image recognition problems. The method searches for optimal regions of interest, using texture information as its feature space and classification accuracy as the fitness function. Texture is analyzed based on the gray level co-occurrence matrix, and classification is carried out with an SVM committee. Results show effective performance compared with previous results using a standard image database.
By
Lopez P., Juan C.; Monroy, Raúl; Hutter, Dieter
2 Citations
Although informal design guidelines and formal development support exist, security protocol development is time-consuming because design is error-prone. In this paper, we introduce Shrimp, a mechanism that aims to speed up the development cycle by adding automated aid for protocol diagnosis and repair. Shrimp relies on existing verification tools both to analyse an intermediate protocol and to compute attacks if the protocol is flawed. It then analyses such attacks to pinpoint the source of the failure and synthesises appropriate patches, using Abadi and Needham’s principles for protocol design. We have translated some of these principles into formal requirements on (sets of) protocol steps. For each requirement, there is a collection of rules that transform a set of protocol steps violating the requirement into a set conforming to it. We have successfully tested our mechanism on 36 faulty protocols, obtaining a repair rate of 90%.
By
Olvera-López, J. Arturo; Martínez-Trinidad, J. Francisco; Carrasco-Ochoa, J. Ariel
3 Citations
Object selection is an important task for instance-based classifiers, since through this process the size of a training set can be reduced, and hence the runtimes of both the classification and training steps. Several methods for object selection have been proposed, but some of them discard objects that are relevant for the classification step. In this paper, we propose an object selection method based on the idea of sequential floating search. This method reconsiders the inclusion of relevant objects previously discarded. Experimental results obtained by our method are shown and compared against other object selection methods.
By
López, Franco Rojas; Jiménez-Salazar, Héctor; Pinto, David
3 Citations
Term selection is a necessary component of most natural language processing tasks. Although different unsupervised techniques have been proposed, the best results are obtained at a high computational cost, for instance, by those based on the use of entropy. The aim of this paper is to propose an unsupervised term selection technique based on a bigram-enriched version of the transition point. Our approach reduces the corpus vocabulary size by using the transition point technique and thereafter expands the reduced corpus with bigrams obtained from the same corpus, i.e., without external knowledge sources. This approach provides a considerable dimensionality reduction of the TREC-5 collection and has also been shown to improve precision compared with some entropy-based methods.
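The transition point technique mentioned here is commonly computed from the number of hapax legomena. A minimal sketch, where the neighbourhood width around the transition point is an illustrative parameter rather than the paper's setting:

```python
import math
from collections import Counter

def transition_point_terms(tokens, width=0.4):
    """Mid-frequency term selection via the transition point
    TP = (sqrt(8 * I1 + 1) - 1) / 2, where I1 is the number of terms
    occurring exactly once (hapax legomena). Terms whose frequency lies
    within +/- width*TP of TP are kept; `width` is an assumed
    neighbourhood parameter for illustration."""
    freq = Counter(tokens)
    i1 = sum(1 for c in freq.values() if c == 1)
    tp = (math.sqrt(8 * i1 + 1) - 1) / 2
    lo, hi = tp * (1 - width), tp * (1 + width)
    return {term for term, c in freq.items() if lo <= c <= hi}
```

The paper's bigram enrichment would then expand this reduced vocabulary with bigrams drawn from the same corpus.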
By
Poli, Ricardo; Langdon, William B.; Clerc, Maurice; Stephens, Christopher R.
4 Citations
We propose a method to build discrete Markov chain models of continuous stochastic optimisers that can approximate them on arbitrary continuous problems to any precision. We discretise the objective function using a finite element method grid, which produces corresponding distinct states in the search algorithm. Iterating the transition matrix gives precise information about the behaviour of the optimiser at each generation, including the probability of it finding the global optimum or being deceived. The approach is tested on a (1+1)-ES, a bare-bones PSO and a real-valued GA. The predictions are remarkably accurate.
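The modelling idea can be sketched for the simplest case, a (1+1) hill-climber on a discretised 1-D objective. The grid, the proposal distribution and the objective are illustrative choices, not the paper's experimental setup:

```python
import numpy as np

def transition_matrix(f, n_cells=50):
    """Markov chain for a (1+1) hill-climber on [0, 1] discretised into
    n_cells states: each step proposes a uniformly random cell and
    accepts it only if it is not worse (minimisation)."""
    centers = (np.arange(n_cells) + 0.5) / n_cells
    fx = f(centers)
    P = np.zeros((n_cells, n_cells))
    for i in range(n_cells):
        accepted = fx <= fx[i]                      # proposals that get accepted
        P[i, accepted] = 1.0 / n_cells
        P[i, i] += 1.0 - accepted.sum() / n_cells   # rejected proposals stay put
    return P, centers, fx

# Iterating the matrix gives the optimiser's state distribution per generation.
P, centers, fx = transition_matrix(lambda x: np.abs(x - 0.31))
p = np.full(len(centers), 1.0 / len(centers))       # uniform initial state
for _ in range(500):
    p = p @ P
```

After enough generations, p concentrates on the cell containing the global optimum, which is exactly the kind of per-generation prediction the abstract describes.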
By
Figueroa, Karina; Fredriksson, Kimmo
4 Citations
We consider indexing and range searching in metric spaces. The best method known is AESA, which in practice requires the fewest distance evaluations to answer range queries. The problem with AESA is its space complexity: it requires storage for Θ(n^{2}) distance values to index n objects. We give several methods to reduce this cost. The main observation is that exact distance values are not needed; lower and upper bounds suffice. The simplest of our methods needs only Θ(n^{2}) bits (as opposed to words) of storage, but the price to pay is more distance evaluations, the exact cost depending on the dimension, as compared to AESA. To reduce this efficiency gap we extend our method to use b distance bounds, requiring Θ(n^{2} log_{2}(b)) bits of storage. The scheme also uses Θ(b) or Θ(bn) words of auxiliary space. We show experimentally that using b ∈ {1,...,16} (depending on the problem instance) gives good results. Our preprocessing and side computation costs are the same as for AESA. We propose several improvements, achieving e.g. O(n^{1 + α}) construction cost for some 0 < α < 1, and a variant using even less space.
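The core idea, replacing exact pairwise distances with quantized lower/upper bounds and pruning via the triangle inequality, can be sketched as follows. The 1-D point set, the uniform bucketing and the naive pivot policy are illustrative simplifications, not the paper's scheme:

```python
import numpy as np

class QuantizedAESA:
    """AESA-style range search storing only quantized pairwise distances:
    each distance becomes a bucket index in [0, b), i.e. a lower/upper
    bound pair costing log2(b) bits instead of a full word. Illustrative
    sketch on 1-D points; the paper targets general metric spaces and
    smarter pivot selection."""

    def __init__(self, points, b=8):
        self.points = np.asarray(points, dtype=float)
        D = np.abs(self.points[:, None] - self.points[None, :])
        step = D.max() / b
        codes = np.minimum((D / step).astype(int), b - 1)
        self.lb = codes * step        # lower bound on each pairwise distance
        self.ub = (codes + 1) * step  # upper bound

    def range_query(self, q, r):
        alive = np.ones(len(self.points), dtype=bool)
        result = []
        while alive.any():
            p = int(np.flatnonzero(alive)[0])   # simplest pivot policy
            dqp = abs(q - self.points[p])       # one exact distance evaluation
            alive[p] = False
            if dqp <= r:
                result.append(p)
            # Triangle inequality with the stored bounds: candidate u is
            # safely discarded when d(q, u) provably exceeds r.
            lower = np.maximum(dqp - self.ub[p], self.lb[p] - dqp)
            alive &= lower <= r
        return sorted(result)
```

Because the pruning uses valid lower bounds, no object within range is ever dismissed; coarser quantization (smaller b) only costs extra distance evaluations.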
By
Pinto, David; Benedí, José-Miguel; Rosso, Paolo
8 Citations
Clustering short texts is a difficult task in itself, and the narrow-domain characteristic poses an additional challenge for current clustering methods. We address this problem with a new measure of distance between documents based on the symmetric Kullback-Leibler distance. Although this measure is commonly used to calculate the distance between two probability distributions, we have adapted it to obtain a distance value between two documents. We have carried out experiments over two different narrow-domain corpora, and our findings indicate that it is possible to use this measure for the addressed problem, obtaining results comparable to those achieved with the Jaccard similarity measure.
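A minimal sketch of a symmetric Kullback-Leibler distance between two documents. The smoothing scheme is an illustrative assumption (the paper's adaptation for document pairs is not detailed in this abstract):

```python
import math
from collections import Counter

def symmetric_kl(doc_a, doc_b, smoothing=1e-3):
    """Symmetric Kullback-Leibler distance between two documents, each
    modelled as a smoothed unigram distribution over the joint
    vocabulary. The smoothing constant is an assumed choice to keep
    both distributions strictly positive."""
    vocab = set(doc_a) | set(doc_b)
    def distribution(doc):
        counts = Counter(doc)
        total = len(doc) + smoothing * len(vocab)
        return {w: (counts[w] + smoothing) / total for w in vocab}
    p, q = distribution(doc_a), distribution(doc_b)
    # sum of (p - q) * log(p / q) equals KL(p||q) + KL(q||p)
    return sum((p[w] - q[w]) * math.log(p[w] / q[w]) for w in vocab)
```

The value is zero for identical documents, symmetric in its arguments, and grows as the vocabularies diverge.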
By
Melin, Patricia; Ochoa, Valente; Valenzuela, Luis; Torres, Gabriela; Clemente, Daniel
We describe in this paper the application of several neural network architectures to the problem of simulating and predicting the dynamic behavior of complex economic time series. We use several neural network models and training algorithms to compare the results and decide, in the end, which one is best for this application. We also compare the simulation results with the traditional approach of using a statistical model. In this case, we use real time series of prices of consumer goods to test our models. Real prices of tomato and green onion in the U.S. show complex fluctuations in time and are very difficult to predict with traditional statistical approaches.
By
Olvera-López, J. Arturo; Martínez-Trinidad, J. Francisco; Carrasco-Ochoa, J. Ariel
1 Citation
In supervised classification, object selection (or instance selection) is an important task, mainly for instance-based classifiers, since through this process the time spent in the training and classification stages can be reduced. In this work, we propose a new mixed-data object selection method based on clustering and border objects. We carried out an experimental comparison between our method and other object selection methods using several mixed-data classifiers.
By
Toscano-Pulido, Gregorio; Coello Coello, Carlos A.; Santana-Quintero, Luis Vicente
9 Citations
This paper presents the Efficient Multi-Objective Particle Swarm Optimizer (EMOPSO), an improved version of a multi-objective evolutionary algorithm (MOEA) previously proposed by the authors. Throughout the paper, we provide several details of the design process that led us to EMOPSO. The main issues discussed are: the mechanism to maintain a set of well-distributed nondominated solutions, the turbulence operator that avoids premature convergence, the constraint-handling scheme, and the study of parameters that led us to propose a self-adaptation mechanism. The final algorithm is able to produce reasonably good approximations of the Pareto front of problems with up to 30 decision variables, while performing only 2,000 fitness function evaluations. As far as we know, this is the lowest number of evaluations reported so far for any multi-objective particle swarm optimizer. Our results are compared with those of the NSGA-II on 12 test functions taken from the specialized literature.
By
Juárez-González, Antonio; Téllez-Valero, Alberto; Denicia-Carral, Claudia; Montes-y-Gómez, Manuel; Villaseñor-Pineda, Luis
This paper describes a QA system centered on a fully data-driven architecture. It applies machine learning and text mining techniques to identify the most probable answers to factoid and definition questions, respectively. Its major quality is that it relies mainly on lexical information and avoids applying complex language processing resources such as named entity classifiers, parsers and ontologies. Experimental results on the Spanish Question Answering task at CLEF 2006 show that the proposed architecture can be a practical solution for monolingual question answering, reaching a precision as high as 51%.
By
Hernández-Rodríguez, Selene; Martínez-Trinidad, J. Francisco; Carrasco-Ochoa, J. Ariel
1 Citation
In this work, a fast k most similar neighbor (k-MSN) classifier for mixed data is presented. The k nearest neighbor (k-NN) classifier has been a widely used non-parametric technique in Pattern Recognition. Many fast k-NN classifiers have been developed for numerical object descriptions, most of them based on metric properties to avoid object comparisons. However, in some sciences such as Medicine, Geology, and Sociology, objects are usually described by numerical and non-numerical features (mixed data). In this case, we cannot assume that the comparison function satisfies metric properties. Therefore, our classifier is based on search algorithms suitable for mixed data and non-metric comparison functions. Experiments and a comparison against two other fast k-NN methods, using standard databases, are presented.
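To make the mixed-data setting concrete, here is a sketch using the Heterogeneous Euclidean-Overlap Metric (HEOM) as the comparison function and an exhaustive k-MSN baseline. HEOM is an illustrative choice, not the paper's comparison function, and the paper's contribution is precisely avoiding this exhaustive scan:

```python
def heom(x, y, numeric_ranges):
    """Heterogeneous Euclidean-Overlap Metric, a common dissimilarity
    for mixed data (an assumed choice for illustration). Numeric
    attributes contribute a range-normalised difference; non-numeric
    ones a 0/1 overlap."""
    total = 0.0
    for a, b, rng in zip(x, y, numeric_ranges):
        if rng is None:                    # non-numeric attribute
            total += 0.0 if a == b else 1.0
        else:                              # numeric attribute
            total += ((a - b) / rng) ** 2
    return total ** 0.5

def k_most_similar_class(query, data, labels, k, numeric_ranges):
    """Exhaustive k-MSN baseline: majority class among the k objects
    most similar to the query."""
    order = sorted(range(len(data)),
                   key=lambda i: heom(query, data[i], numeric_ranges))
    top = [labels[i] for i in order[:k]]
    return max(set(top), key=top.count)
```

A `None` entry in `numeric_ranges` marks an attribute as non-numeric; numeric entries give the attribute's value range for normalisation.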
By
Melin, Patricia; Gonzalez, Claudia; Bravo, Diana; Gonzalez, Felma; Martinez, Gabriela
7 Citations
We describe in this paper a new approach for pattern recognition using modular neural networks with a fuzzy logic method for response integration. We propose a new architecture for modular neural networks for achieving pattern recognition in the particular case of human faces and fingerprints. The method for achieving response integration is based on the fuzzy Sugeno integral with some modifications. Response integration is required to combine the outputs of all the modules in the modular network. We have applied the new approach to fingerprint and face recognition with a real database of students from our institution.
By
Bonev, Boyan; Escolano, Francisco; Lozano, Miguel A.; Suau, Pablo; Cazorla, Miguel A.; Aguilar, Wendy
11 Citations
In this paper, we propose a novel method for the unsupervised clustering of graphs in the context of the constellation approach to object recognition. The method is an EM central clustering algorithm which builds prototypical graphs on the basis of fast matching with graph transformations. Our experiments, both with random graphs and in realistic situations (visual localization), show that our prototypes improve on the set median graphs and also on the prototypes derived from our previous incremental method. We also discuss how the method scales with a growing number of images.
By
Collazos, Cesar A.; González, María Paula; Neyem, Andrés; Sturm, Christian
Interpersonal communication involves more than just words; it involves emotional issues that can be roughly seen as complex organized internal states. Awareness of those states allows human beings to evaluate social information and develop strategic social intelligence. In this setting, developing emotional awareness devices can be successfully achieved under a Cultural Centred Design perspective, as social and cultural features are crucial to ensure an adequate level of emotional awareness. However, culturally oriented recommendations are not always included to guide the promotion of adequate emotional awareness in digital and physical devices. To cope with this problem, this paper presents a minimal set of cultural guidelines that should be taken into account to develop emotional awareness devices under Cultural Centred Design. To illustrate the proposal, the development of an extended virtual portrait is discussed, highlighting a cultural viewpoint from a Latin-American perspective.
By
Hliněný, Petr; Salazar, Gelasio
3 Citations
Crossing Number is one of the most challenging algorithmic problems in topological graph theory, with applications to graph drawing and VLSI layout. No polynomial-time constant-approximation algorithm is known for this NP-complete problem. We prove that a natural approach to planar drawing of toroidal graphs (used already by Pach and Tóth in [21]) gives a polynomial-time constant-approximation algorithm for the crossing number of toroidal graphs with bounded degree. In this proof we present a new “grid” theorem on toroidal graphs.
By
Delgado-Mata, Carlos; Aylett, Ruth S.
The paper investigates the role of an affective system as part of an ethologically inspired action-selection mechanism for virtual animals in a 3D interactive graphics environment. It discusses the integration of emotion with flocking and grazing behaviours and a mechanism for communicating emotion between animals; it develops a metric for analyzing the collective behaviour of the animals and its complexity, and shows that emotion reduces the complexity of behaviour and thus mediates between individual and collective behaviour.
By
Tchernykh, A.; Cristóbal-Salas, A.; Kober, V.; Ovseevich, I. A.
In this paper, a partial evaluation technique to reduce the communication costs of distributed image processing is presented. It combines the application of incomplete structures and partial evaluation with classical program optimizations such as constant propagation, loop unrolling and dead-code elimination. Through a detailed performance analysis, we establish the conditions under which the technique is beneficial.
By
Cruz Reyes, Laura; Nieto-Yáñez, Diana M.; Rangel-Valdez, Nelson; Herrera Ortiz, Juan A.; González B, J.; Castilla Valdez, Guadalupe; Delgado-Orta, J. Francisco
2 Citations
The present paper approaches the loading distribution of trucks for product transportation as a rich problem, formulated as the classic Bin Packing Problem with five variants associated with a real case study. A review of the state of the art reveals that related work deals with at most three variants and does not consider the relation with the vehicle routing problem. For the solution of this new rich problem a heuristic-deterministic algorithm was developed, which works together with a metaheuristic algorithm to assign routes and loads. The results of solving a set of real-world instances show an average saving of three vehicles with respect to the manual solution; the latter needed 180 minutes to solve an instance, while the present methodology takes two minutes. On average, the demand was satisfied at 97.45%. As future work, the use of a non-deterministic algorithm is intended.
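For readers unfamiliar with the classic Bin Packing Problem underlying this formulation, a standard deterministic baseline is the First-Fit Decreasing heuristic, sketched below. This is a textbook heuristic for the classic problem, not the paper's algorithm for the rich variant:

```python
def first_fit_decreasing(loads, capacity):
    """First-Fit Decreasing heuristic for the classic Bin Packing
    Problem: sort items by size, descending, and place each into the
    first truck (bin) that still has room, opening a new one if none
    fits."""
    bins = []
    for load in sorted(loads, reverse=True):
        for b in bins:
            if sum(b) + load <= capacity:
                b.append(load)
                break
        else:
            bins.append([load])
    return bins
```

The rich problem adds five real-world variants on top of this core packing model and couples it with route assignment.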
By
Chakraborty, Debrup; Pal, Nikhil R.
Typically, the response of a multilayer perceptron (MLP) network on points which are far away from the boundary of its training data is not very reliable. When test points are far from the training data, the network should not make any decision on them. We propose a training scheme for MLPs which tries to achieve this. Our methodology trains a composite network consisting of two subnetworks: a mapping network and a vigilance network. The mapping network learns the usual input-output relation present in the data, and the vigilance network learns a decision boundary and decides on which points the mapping network should respond. Although we propose the methodology for multilayer perceptrons, the philosophy is quite general and can be used with other learning machines as well.
By
Gelbukh, Alexander; Sidorov, Grigori; Chanona-Hernandez, Liliana
In this paper, we present an optimization algorithm for finding the best text alignment based on lexical similarity, and the results of its evaluation compared with baseline methods (Gale and Church, relative position). For evaluation, we use fiction texts that represent non-trivial cases of alignment. We also present a new method for evaluating parallel text alignment algorithms, which consists of restoring the structure of the text in one of the languages using the units of the lower level and the available structure of the text in the other language. For example, in the case of paragraph-level alignment, the sentences are used to constitute the restored paragraphs. The advantage of this method is that it does not depend on corpus data.
By
Aguilar, Wendy; Martinez-Perez, M. Elena; Frauel, Yann; Escolano, Francisco; Lozano, Miguel Angel; Espinosa-Romero, Arturo
7 Citations
In this paper, we propose a highly robust point-matching method (Graph Transformation Matching, GTM) relying on finding the consensus graph emerging from putative matches. The method is a two-phase one in the sense that after finding the consensus graph it tries to complete it as much as possible. We successfully apply GTM to image registration in the context of building mosaics from retinal images. Feature points are obtained after properly segmenting such images. In addition, we introduce a novel topological descriptor for quantifying disease by characterizing the arterial/venular trees. This descriptor relies on diffusion kernels on graphs. Our experiments have shown statistical significance only for the case of arterial trees, which is consistent with previous findings.
By
Li, Kang; Peng, JianXun; Fei, Minrui; Li, Xiaoou; Yu, Wen
This paper investigates the construction of a wide class of single-hidden-layer neural networks (SLNNs) with or without tunable parameters in the hidden nodes. It is a challenging problem if both parameter training and the determination of network size are considered simultaneously. Two alternative network construction methods are considered in this paper. Firstly, the discrete construction of SLNNs is introduced. The main objective is to select a subset of hidden nodes from a pool of candidates with parameters fixed ‘a priori’. This is called discrete construction since no parameters in the hidden nodes need to be trained. The second approach is called continuous construction, as all the adjustable network parameters are trained on the whole parameter space along the network construction process. In the second approach, there is no need to generate a pool of candidates, and the network grows node by node with the adjustable parameters optimized. The main contribution of this paper is to show that network construction can be done using these two alternative approaches, and that they can be integrated within a unified analytic framework, leading to potentially significant improvements in model performance and/or computational efficiency.
By
Pérez, Cynthia Beatriz; Olague, Gustavo
2 Citations
This work describes an evolutionary approach to texture segmentation, a long-standing and important problem in computer vision. The difficulty of the problem is related to the fact that real-world textures are complex to model and analyze; segmenting texture images is hard due to the irregular regions found in textures. We present our EvoSeg algorithm, which uses knowledge derived from texture analysis to identify how many homogeneous regions exist in the scene without a priori information. EvoSeg uses texture features derived from the Gray Level Co-occurrence Matrix and optimizes a fitness measure, based on the minimum variance criterion, using a hierarchical GA. We present qualitative results obtained by applying EvoSeg to synthetic and real-world images and compare it with the state-of-the-art JSEG algorithm.
By
Carrera, Gerardo; Savage, Jesus; Mayol-Cuevas, Walterio
1 Citation
This paper presents a robust implementation of an object tracker able to tolerate partial occlusions, rotation and scale changes for a variety of different objects. The objects are represented by collections of interest points described in a multiresolution framework, giving a representation of those points at different scales. Inspired by [1], a stack of descriptors is built only the first time the interest points are detected and extracted from the region of interest. This provides efficiency of representation and results in faster tracking, since it can be done offline. An Unscented Kalman Filter (UKF) with a constant velocity model estimates the position and the scale of the object; with the uncertainty in position and scale obtained from the UKF, the search for the object can be constrained to a specific region in both the image and in scale. This approach shows an improvement in real-time tracking and in the ability to recover from full occlusions.
By
Rojas, Franco; Jiménez-Salazar, Héctor; Pinto, David
Nowadays, cross-lingual Information Retrieval (IR) is one of the greatest challenges to deal with. Moreover, one of the most important issues in IR is the reduction of the corpus vocabulary. In real situations, for some IR methods such as the well-known vector space model, it is necessary to reduce the term space. In this work, we have considered a vocabulary reduction process based on the selection of mid-frequency terms. Our approach enhances precision, but in order to obtain better recall, we have conducted an enrichment process based on the addition of co-occurrence terms. By using this approach, we obtained an improvement of 40% on the BiEnEs topics of the WebCLEF 2005 task. The results obtained in the current mixed monolingual task of WebCLEF 2006 show that the text enrichment must be done before the vocabulary reduction process in order to achieve the best performance.
By
Gomez, Juan Carlos; Fuentes, Olac
The hybridization of optimization techniques can exploit the strengths of different approaches and avoid their weaknesses. In this work we present a hybrid optimization algorithm based on the combination of Evolution Strategies (ES) and Locally Weighted Linear Regression (LWLR). In this hybrid, a local algorithm (LWLR) proposes a new solution that is used by a global algorithm (ES) to produce new, better solutions. This hybrid is applied to an interesting and difficult problem in astronomy: the two-dimensional fitting of brightness profiles in galaxy images.
The use of standardized fitting functions is arguably the most powerful method for measuring the large-scale features (e.g. brightness distribution) and structure of galaxies, specifying parameters that can provide insight into their formation and evolution. Here we employ the hybrid algorithm ES+LWLR to find models that describe the two-dimensional brightness profiles of a set of optical galactic images. Models are created using two functions, de Vaucouleurs and exponential, which produce models expressed as sets of concentric generalized ellipses representing the brightness profiles of the images.
The problem can be seen as an optimization problem because we need to minimize the difference between the flux from the model and the flux from the original optical image, using a normalized Euclidean distance. We solved this optimization problem using our hybrid algorithm ES+LWLR. We have obtained results for a set of 100 galaxies, showing that the hybrid algorithm is well suited to solving this problem.
By
Saarikoski, Harri M. T.; Legrand, Steve; Gelbukh, Alexander
1 Citation
We present a novel method for improving disambiguation accuracy by building an optimal ensemble (OE) of systems, where we predict the best available system for a target word using a priori case factors (e.g. the amount of training per sense). We report promising results from a series of best-system prediction tests (the best prediction accuracy is 0.92) and show that complex/simple systems disambiguate tough/easy words better. The method provides the following benefits: (1) higher disambiguation accuracy for virtually any base systems (the current best OE yields close to a 2% accuracy gain over the Senseval-3 state of the art) and (2) an economical way of building more effective ensembles of all types (e.g. optimal, weighted voting and cross-validation based). The method is also highly scalable in that it utilizes readily available factors, available for any ambiguous word in any language, to estimate word difficulty, and defines classifier complexity using known properties only.
By
Vite-Silva, Israel; Cruz-Cortés, Nareli; Toscano-Pulido, Gregorio; Fraga, Luis Gerardo
4 Citations
Triangulation is a process by which a 3D point position can be calculated from two images in which that point is visible. This process requires the intersection of two known lines in space. However, in the presence of noise this intersection does not occur, and it is necessary to estimate the best approximation. One option towards achieving this goal is the use of evolutionary algorithms. In general, evolutionary algorithms are very robust optimization techniques; however, in some cases they can have trouble finding the global optimum, getting trapped in a local optimum. To overcome this situation some authors have suggested removing the local optima in the search space by transforming a single-objective problem into a multi-objective one. This process is called multi-objectivization. In this paper we successfully apply this multi-objectivization to the triangulation problem.
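The classical single-objective estimate the abstract alludes to is the midpoint of the shortest segment joining the two viewing rays. A sketch of that baseline (camera centres and directions are hypothetical inputs; the paper's evolutionary formulation replaces this closed-form estimate):

```python
import numpy as np

def triangulate_midpoint(p1, d1, p2, d2):
    """Midpoint triangulation: given two rays p + t*d (camera centre p,
    viewing direction d), return the midpoint of the shortest segment
    joining them, the classical closed-form 3D point estimate."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    # Solve for t1, t2 minimising ||(p1 + t1*d1) - (p2 + t2*d2)||^2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    w = p1 - p2
    denom = a * c - b * b          # zero only for parallel rays
    t1 = (b * (d2 @ w) - c * (d1 @ w)) / denom
    t2 = (a * (d2 @ w) - b * (d1 @ w)) / denom
    return ((p1 + t1 * d1) + (p2 + t2 * d2)) / 2
```

When the rays intersect exactly (no noise) the midpoint coincides with the intersection point; with noise, it splits the residual evenly between the two rays.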
By
Castillo, Oscar; Huesca, Gabriel; Valdez, Fevrier
11 Citations
We describe in this paper the use of hierarchical genetic algorithms for fuzzy system optimization in intelligent control. In particular, we consider the problem of optimizing the number of rules and membership functions using an evolutionary approach. The hierarchical genetic algorithm enables the optimization of the fuzzy system design for a particular application. We illustrate the approach with the case of intelligent control in a medical application. Simulation results for this application show that we are able to find an optimal set of rules and membership functions for the fuzzy system.
By
Avilés-López, Edgardo; García-Macías, J. Antonio
4 Citations
The computing grid no longer encompasses only traditional computers performing coordinated tasks; low-end devices are now also considered active members of the envisioned pervasive grid. Wireless sensor networks play an important role in this vision, since they provide the means for gathering vast amounts of data about physical phenomena. However, the current integration of wireless sensor networks and the grid is still primitive; one important aspect of this integration is providing higher-level abstractions for the development of applications, since accessing the data from wireless sensor networks currently implies dealing with very low-level constructs. We propose TinySOA, a service-oriented architecture that allows programmers to access wireless sensor networks from their applications by using a simple service-oriented API in the language of their choice. We show an implementation of TinySOA and some sample applications developed with it that exemplify how easily grid applications can integrate sensor networks.
By
Hernández-Rodríguez, Selene; Martínez-Trinidad, J. Francisco; Carrasco-Ochoa, J. Ariel
The nearest neighbor (NN) classifier has been a widely used technique in pattern recognition because of its simplicity and good behavior. To decide the class of a new object, the NN classifier performs an exhaustive comparison between the object to classify and the training set T. However, when T is large, the exhaustive comparison is very expensive and sometimes becomes inapplicable. To avoid this problem, many fast NN algorithms have been developed for numerical object descriptions, most of them based on metric properties to avoid comparisons. However, in some sciences such as Medicine, Geology, and Sociology, objects are usually described by numerical and non-numerical attributes (mixed data). In this case, we cannot assume that the comparison function satisfies metric properties. Therefore, in this paper a fast most-similar-object classifier based on search methods suitable for mixed data is presented. Experiments using standard databases and a comparison with two other fast NN methods are presented.
more …
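As an editorial illustration of the exhaustive comparison the abstract describes (this is not the authors' code; numeric features and Euclidean distance are assumed here, whereas the paper targets mixed data where no metric can be assumed), a minimal 1-NN scan over the training set T might look like:

```python
import math

def nn_classify(x, training_set):
    """Exhaustive 1-NN: compare x against every (vector, label) pair in T."""
    best_label, best_dist = None, math.inf
    for vec, label in training_set:
        d = math.dist(x, vec)  # Euclidean distance (Python 3.8+)
        if d < best_dist:
            best_dist, best_label = d, label
    return best_label

T = [((0.0, 0.0), "a"), ((1.0, 1.0), "a"), ((5.0, 5.0), "b")]
print(nn_classify((4.0, 4.5), T))  # nearest neighbor is (5, 5) -> "b"
```

The cost of this scan is linear in |T| per query, which is exactly what fast NN methods try to avoid.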
By
Pinto, David; Rosso, Paolo
Post to Citeulike
Clustering is often considered the most important unsupervised learning problem, and several clustering algorithms have been proposed over the years. Many of these algorithms have been tested on classical clustering corpora such as Reuters and 20 Newsgroups in order to determine their quality. However, up to now the relative hardness of those corpora has not been determined. The relative clustering hardness of a given corpus is of high interest, since it would help determine whether the corpora usually used to benchmark clustering algorithms are hard enough. Moreover, if it were possible to find a set of features involved in the hardness of the clustering task itself, specific clustering techniques could be used instead of general ones in order to improve the quality of the obtained clusters. In this paper, we present a study of one specific feature: the vocabulary overlap among documents of a given corpus. Our preliminary experiments were carried out on three different corpora: the train and test versions of the R8 subset of the Reuters collection and a reduced version of the 20 Newsgroups (Mini20Newsgroups). We found that a possible relation between the vocabulary overlap and the F-Measure can be established.
more …
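One simple way to quantify the vocabulary overlap the abstract studies is the Jaccard coefficient between the word sets of two documents; this is an illustrative measure, and the paper's exact definition may differ:

```python
def vocabulary_overlap(doc_a, doc_b):
    """Jaccard overlap between the vocabularies (word sets) of two documents."""
    va, vb = set(doc_a.lower().split()), set(doc_b.lower().split())
    if not (va | vb):
        return 0.0  # two empty documents: define overlap as zero
    return len(va & vb) / len(va | vb)

print(vocabulary_overlap("the cat sat", "the cat ran"))  # 2 shared / 4 total = 0.5
```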
By
PérezCoutiño, Manuel; MontesyGómez, Manuel; LópezLópez, Aurelio; VillaseñorPineda, Luis; PancardoRodríguez, Aarón
Show all (5)
Post to Citeulike
1 Citations
This paper describes the experiments performed for QA@CLEF2006 within the joint participation of the eLing Division at VEng and the Language Technologies Laboratory at INAOE. The aim of these experiments was to observe and quantify the improvements in the final step of the Question Answering prototype when syntactic features were included in the decision process. To reach this goal, a shallow approach to answer ranking based on the term density measure was integrated into the weighting schema. This approach shows an interesting improvement over the same prototype without this module. The paper discusses the results achieved, the conclusions, and further directions of this research.
more …
By
Trujillo, Leonardo; Olague, Gustavo
Post to Citeulike
1 Citations
This work presents scale-invariant region detectors that apply evolved operators to extract an interest measure. We evaluate operators using their repeatability rate, and have experimentally identified a plateau of local optima within a space of possible interest operators Ω. The space Ω contains operators constructed with Gaussian derivatives and standard arithmetic operations. From this set of local extrema, we have chosen two operators, obtained by searching within Ω using Genetic Programming, that are optimized for high repeatability and global separability when imaging conditions are modified by a known transformation. Then, by embedding the operators into the linear scale space generated with a Gaussian kernel, we can characterize scale-invariant features by detecting extrema within the scale-space response of each operator. Our scale-invariant region detectors exhibit high performance when compared with state-of-the-art techniques on standard tests.
more …
By
AcevesPérez, Rita Marina; MontesyGómez, Manuel; VillaseñorPineda, Luis
Post to Citeulike
3 Citations
One major problem of state-of-the-art Cross-Language Question Answering systems is the translation of user questions. This paper proposes combining the potential of multiple machine translation systems in order to improve the final answering precision. In particular, it presents three different methods for this purpose. The first focuses on selecting the most fluent translation from a given set; the second combines the passages recovered by several question translations; the third constructs a new question reformulation by merging word sequences from different translations. Experimental results demonstrate that the proposed approaches reduce the error rates relative to a monolingual question answering exercise.
more …
By
RománGodínez, Israel; YáñezMárquez, Cornelio
Post to Citeulike
1 Citations
Most heteroassociative memory models aim to recall the entire trained pattern. The Alpha-Beta associative memories only ensure correct recall of the trained patterns in the autoassociative case, but not in the heteroassociative one. In this work we present a new algorithm, based on the Alpha-Beta heteroassociative memories, that allows not only correct recall of some altered patterns but also perfect recall of all the trained patterns, without ambiguity. The theoretical support and some experimental results are presented.
more …
By
Levner, Eugene; Pinto, David; Rosso, Paolo; Alcaide, David; Sharma, R. R. K.
Show all (5)
Post to Citeulike
1 Citations
Among the various document clustering algorithms that have been proposed so far, the most useful are those that automatically reveal the number of clusters and assign each target document to exactly one cluster. However, in many real situations, no exact boundary exists between different clusters. In this work, we introduce a fuzzy version of the MajorClust algorithm. The proposed clustering method assigns documents to more than one category by taking into account a membership function for both the edges and the nodes of the corresponding underlying graph. Thus, the clustering problem is formulated in terms of weighted fuzzy graphs. The fuzzy approach permits decreasing some negative effects that appear when clustering large corpora with noisy data.
more …
By
AcevesPérez, Rita M.; MontesyGómez, Manuel; VillaseñorPineda, Luis
Post to Citeulike
One major problem in multilingual Question Answering (QA) is the combination of answers obtained in different languages into one single ranked list. This paper proposes a new method for tackling this problem. The method is founded on a graph-based ranking approach inspired by the popular Google PageRank algorithm. Experimental results demonstrate that the proposed method outperforms other current techniques for answer fusion, and also evidence the advantages of multilingual QA over the traditional monolingual approach.
more …
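To illustrate the kind of graph-based ranking the abstract refers to, here is a plain power-iteration PageRank over a small, entirely hypothetical answer graph (edges linking answers judged similar across languages are an assumption for this sketch, not the paper's construction):

```python
def pagerank(adj, damping=0.85, iters=50):
    """Power-iteration PageRank over an adjacency dict {node: [neighbors]}."""
    nodes = list(adj)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        new = {v: (1.0 - damping) / n for v in nodes}
        for v, outs in adj.items():
            if not outs:
                # dangling node: spread its rank uniformly over all nodes
                for u in nodes:
                    new[u] += damping * rank[v] / n
            else:
                for u in outs:
                    new[u] += damping * rank[v] / len(outs)
        rank = new
    return rank

# Hypothetical answer graph: "madrid" is supported by candidates in two languages.
g = {"madrid": ["madrid_es", "madrid_fr"], "madrid_es": ["madrid"],
     "madrid_fr": ["madrid"], "paris": []}
r = pagerank(g)
print(max(r, key=r.get))  # the best-supported answer wins the ranking
```

The answer supported by candidates from several languages accumulates rank, which is the intuition behind ranking fused answer lists this way.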
By
Escobar, Ivan; Vilches, Erika; Vallejo, Edgar E.; Cody, Martin L.; Taylor, Charles E.
Show all (5)
Post to Citeulike
In this paper, we explore the emergence of acoustic categories in sensor arrays. We describe a series of experiments on the automatic categorization of species and individual birds using self-organizing maps. Experimental results showed that meaningful acoustic categories can arise through self-organizing processes in sensor arrays. In addition, we discuss how distributed categorization could be used for the emergence of symbolic communication on these platforms.
more …
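For readers unfamiliar with self-organizing maps, a single training step of a standard SOM (best-matching unit search plus Gaussian-neighborhood update) can be sketched as follows; the grid size, learning rate, and feature dimension are illustrative, not taken from the paper:

```python
import numpy as np

def som_step(weights, x, sigma=1.0, lr=0.5):
    """One self-organizing-map step: find the best-matching unit (BMU),
    then pull the BMU and its grid neighbors toward the input x."""
    # weights: (rows, cols, dim) grid of prototype vectors
    dists = np.linalg.norm(weights - x, axis=2)
    bmu = np.unravel_index(np.argmin(dists), dists.shape)
    rows, cols = np.indices(dists.shape)
    grid_d2 = (rows - bmu[0]) ** 2 + (cols - bmu[1]) ** 2
    h = np.exp(-grid_d2 / (2 * sigma ** 2))       # neighborhood function
    weights += lr * h[..., None] * (x - weights)  # move prototypes toward x
    return bmu

rng = np.random.default_rng(0)
w = rng.random((4, 4, 3))                 # 4x4 map of 3-dimensional prototypes
bmu = som_step(w, np.array([0.1, 0.9, 0.2]))
print(bmu)  # grid coordinates of the winning unit
```

Repeating this step over many acoustic feature vectors lets similar sounds settle into neighboring map units, which is how acoustic categories emerge.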
By
GalvánLópez, Edgar; RodríguezVázquez, Katya
Post to Citeulike
2 Citations
This paper describes Multiple Interactive Outputs in a Single Tree (MIOST), a new form of Genetic Programming (GP). Our approach is based on two ideas. First, we have taken inspiration from graph-GP representations; with this idea we decided to explore the possibility of representing programs as graphs with oriented links. Second, our individuals can have more than one output. This idea was inspired by the divide-and-conquer principle: a program is decomposed into subprograms, and so we expect to make the original problem easier by breaking it down into two or more subproblems. To verify the effectiveness of our approach, we have used several evolvable hardware problems of different complexity taken from the literature. Our results indicate that our approach has a better overall performance in terms of consistency in reaching feasible solutions.
Keywords: Multiple Interactive Outputs in a Single Tree, Genetic Programming, graph-GP representations.
more …
By
Hliněný, Petr; Salazar, Gelasio
Post to Citeulike
6 Citations
Crossing minimization is one of the most challenging algorithmic problems in topological graph theory, with strong ties to graph drawing applications. Despite a long history of intensive research, no practical “good” algorithm for crossing minimization is known (that is hardly surprising, since the problem itself is NP-complete). Even more surprising is how little we know about a seemingly simple particular problem: minimizing the number of crossings in an almost planar graph, that is, a graph with an edge whose removal leaves a planar graph. This problem is in turn a building block in an “edge insertion” heuristic for crossing minimization. In this paper we prove a constant-factor approximation algorithm for the crossing number of almost planar graphs with bounded degree. On the other hand, we demonstrate the nontriviality of the crossing minimization problem on almost planar graphs by exhibiting several examples, among them new families of crossing-critical graphs which are almost planar and projective.
2000 Math Subject Classification: 05C10, 05C62, 68R10.
more …
By
Astudillo, Leslie; Castillo, Oscar; Aguilar, Luis T.; Martínez, Ricardo
Show all (4)
Post to Citeulike
8 Citations
This paper focuses on the control of a wheeled mobile robot under bounded torque disturbances. A hybrid tracking controller for the mobile robot was developed by considering its kinematic model and Euler-Lagrange dynamics. The procedure consists of minimizing the stabilization error of the kinematic model through a genetic algorithm approach, while attenuation of the perturbing torques is achieved through type-2 Fuzzy Logic Control (FLC) via a backstepping methodology. Type-2 fuzzy logic is proposed to synthesize the controller for the overall system, which is claimed to be a robust tool for related applications. The theoretical results are illustrated through computer simulations of the closed-loop system.
more …
By
Kober, Vitaly; Agis, Jacobo Gomez
Post to Citeulike
2 Citations
A local adaptive restoration technique using a sliding discrete cosine transform (DCT) is presented. A minimum mean-square-error estimator in the domain of a sliding DCT for image restoration is derived. The local restoration is performed by pointwise modification of local DCT coefficients. To provide image processing in real time, a fast recursive algorithm for computing the sliding DCT is utilized. The algorithm is based on a recursive relationship between three subsequent local DCT spectra. Computer simulation results using a real image are provided and compared with those of common restoration techniques.
more …
By
Jesus Rubio, Jose; Yu, Wen
Post to Citeulike
5 Citations
Compared to normal learning algorithms, for example backpropagation, the optimal bounded ellipsoid (OBE) algorithm has some better properties, such as faster convergence, since it has a structure similar to a Kalman filter. OBE also has some advantages over Kalman filter training: the noise is not required to be Gaussian. In this paper the OBE algorithm is applied to train the weights of recurrent neural networks for nonlinear system identification. Both hidden layers and output layers can be updated. From a dynamic-systems point of view, such training can be useful for all neural network applications requiring real-time updating of the weights. A simple simulation demonstrates the effectiveness of the suggested algorithm.
more …
By
BayroCorrochano, Eduardo; Trujillo, Noel; Naranjo, Michel
Post to Citeulike
28 Citations
This paper presents an application of the quaternion Fourier transform to preprocessing for neural computing. In a new way, the 1D acoustic signals of French spoken words are represented as 2D signals in the frequency and time domains. These kinds of images are then convolved in the quaternion Fourier domain with a quaternion Gabor filter for the extraction of features. This approach greatly reduces the dimension of the feature vector. Two methods of feature extraction are tested. The feature vectors were used for the training of a simple MLP, a TDNN, and a system of neural experts. The improvements in the classification rate of the neural network classifiers are very encouraging and amply justify the preprocessing in the quaternion frequency domain. This work also suggests the application of the quaternion Fourier transform to other image processing tasks.
more …
By
Pérez O, Joaquín; Pazos R, Rodolfo; Cruz R, Laura; Reyes S, Gerardo; Basave T, Rosy; Fraire H, Héctor
Show all (6)
Post to Citeulike
6 Citations
Clustering problems arise in many different applications: machine learning, data mining, knowledge discovery, data compression, vector quantization, pattern recognition, and pattern classification. One of the most popular and widely studied clustering methods is K-means. Several improvements to the standard K-means algorithm have been carried out, most of them related to the initial parameter values. In contrast, this article proposes an improvement using a new convergence condition that consists of stopping the execution when a local optimum is found or no more object exchanges among groups can be performed. For assessing the improvement attained, the modified algorithm (Early Stop K-means) was tested on six databases of the UCI repository, and the results were compared against SPSS, Weka, and the standard K-means algorithm. Experimentally, Early Stop K-means obtained important reductions in the number of iterations and improvements in solution quality with respect to the other algorithms.
more …
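The convergence condition described above — stop as soon as no object changes cluster — can be sketched as follows. This is an illustrative reimplementation of the idea, not the authors' Early Stop K-means code; the random initialization and toy data are assumptions:

```python
import random

def kmeans_early_stop(points, k, seed=0):
    """K-means that stops as soon as no point changes cluster
    (the 'no more object exchanges' condition)."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    assign = [None] * len(points)
    while True:
        changed = False
        for i, p in enumerate(points):
            c = min(range(k),
                    key=lambda j: sum((a - b) ** 2 for a, b in zip(p, centers[j])))
            if c != assign[i]:
                assign[i], changed = c, True
        if not changed:          # local optimum reached: stop early
            return centers, assign
        for j in range(k):       # recompute centroids
            members = [p for p, a in zip(points, assign) if a == j]
            if members:
                centers[j] = tuple(sum(col) / len(members) for col in zip(*members))

pts = [(0.0, 0.0), (0.1, 0.2), (5.0, 5.0), (5.2, 4.9)]
centers, assign = kmeans_early_stop(pts, 2)
print(assign)  # the two tight groups end up in separate clusters
```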
By
Hernández, Igmar; García, Paola; Nolazco, Juan; Buera, Luis; Lleida, Eduardo
Show all (5)
Post to Citeulike
This work presents a robust normalization technique that cascades a speech enhancement method with a feature vector normalization algorithm. To provide speech enhancement, the Spectral Subtraction (SS) algorithm is used; this method reduces the effect of additive noise by subtracting a noise spectrum estimate from the complete speech spectrum. On the other hand, an empirical feature vector normalization technique known as PD-MEMLIN (Phoneme-Dependent Multi-Environment Models based LInear Normalization) has also been shown to be effective. PD-MEMLIN models clean and noisy spaces employing Gaussian Mixture Models (GMMs), and estimates a set of linear compensation transformations to be used to clean the signal. The proper integration of both approaches is studied, and the final design, PD-MEEMLIN (Phoneme-Dependent Multi-Environment Enhanced Models based LInear Normalization), confirms and improves the effectiveness of both approaches. The results obtained show that for severely degraded speech, PD-MEEMLIN outperforms SS by between 11.4% and 34.5%, and PD-MEMLIN by between 11.7% and 24.84%. Furthermore, at moderate SNRs, i.e. 15 or 20 dB, PD-MEEMLIN is as good as the PD-MEMLIN and SS techniques.
more …
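The spectral subtraction step can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: it uses the true noise signal as the "estimate" purely for demonstration (a real system estimates the noise spectrum from non-speech frames), and a simple spectral floor to avoid negative magnitudes:

```python
import numpy as np

def spectral_subtraction(noisy, noise_estimate, floor=0.01):
    """Subtract a noise magnitude-spectrum estimate from the noisy spectrum,
    keeping the noisy phase, and resynthesize the time-domain signal."""
    spec = np.fft.rfft(noisy)
    mag, phase = np.abs(spec), np.angle(spec)
    noise_mag = np.abs(np.fft.rfft(noise_estimate))
    clean_mag = np.maximum(mag - noise_mag, floor * mag)  # spectral floor
    return np.fft.irfft(clean_mag * np.exp(1j * phase), n=len(noisy))

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 256, endpoint=False)
clean = np.sin(2 * np.pi * 8 * t)                 # a clean 8 Hz tone
noise = 0.3 * rng.standard_normal(256)            # additive white noise
enhanced = spectral_subtraction(clean + noise, noise)
print(np.mean((enhanced - clean) ** 2))           # residual error vs. the clean tone
```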
By
Heredia, V. M.; Urrutia, J.
Post to Citeulike
Let P_n be a set of n points in the plane in general position, n ≥ 4. A convex quadrangulation of P_n is a partitioning of the convex hull Conv(P_n) of P_n into a set of quadrilaterals such that their vertices are elements of P_n, and no element of P_n lies in the interior of any quadrilateral. It is straightforward to see that if P_n admits a quadrangulation, its convex hull must have an even number of vertices. In [6] it was proved that if the convex hull of P_n has an even number of points, then by adding at most 3n/2 Steiner points in the interior of its convex hull, we can always obtain a point set that admits a convex quadrangulation. The authors also show that n/4 Steiner points are sometimes necessary. In this paper we show how to improve the upper and lower bounds of [6] to 4n/5 + 2 and n/3, respectively. In fact, in this paper we prove an upper bound of n, and with a long and unenlightening case analysis (over fifty cases!) we can improve the upper bound to 4n/5 + 2; for details see [9].
more …
By
Morán L., Luis E.; PintoElías, Raúl
Post to Citeulike
In this work we assume a frontal view of the face for lip shape extraction. The first step is to locate a face inside a digital image; for this task we use color-based techniques to extract only the pixels with skin tone, and templates based on integral projections are applied to verify and locate the face. Using integral projections, we then locate and define a region of interest for the lips. A statistical model of the lips (ASM) was previously created in the same way; the local appearance patterns of the landmarks are modeled using Local Binary Patterns (LBP). With this model we try to capture the variation from closed lips to open lips. For the search task, Local Binary Pattern Histograms (LBPH) are used.
more …
By
Martínez, Mauricio; Vallejo, Edgar E.; Morett, Enrique
Post to Citeulike
This paper explores the capabilities of genetic algorithms for reconstructing ancestral DNA sequences. We conducted a series of experiments on reconstructing ancestral states from a given collection of taxa and their phylogenetic relationships. We tested the proposed model using simulated phylogenies obtained from actual DNA sequences by applying realistic mutation rates. Experimental results demonstrated that the recursive application of genetic algorithms to smaller instances of the problem allows us to reconstruct ancestral DNA states accurately.
more …
By
Fraga, Luis Gerardo; Silva, Israel Vite; CruzCortés, Nareli
Post to Citeulike
We use a genetic algorithm to solve the problem, widely treated in the specialized literature, of fitting an ellipse to a set of given points. Our proposal uses as its objective function the minimization of the sum of orthogonal Euclidean distances from the given points to the curve; this is a nonlinear problem, which is usually solved by minimizing the squared distances instead, since that allows the use of the gradient and numerical methods based on it, such as Gauss-Newton. The novelty of the proposed approach is that, since we are using a GA, our algorithm does not need initialization and can use the Euclidean distance as the objective function. We also show that in our experiments we are able to obtain better results than those previously reported. Additionally, our solutions have a very low variance, which indicates the robustness of our approach.
more …
By
Sun, Chaofan; Vilalta, Ricardo
Post to Citeulike
1 Citations
This paper presents a data preprocessing procedure to select support vector (SV) candidates. We select decision boundary region vectors (BRVs) as SV candidates. Without the need to compute the decision boundary, BRVs can be selected based on a vector's nearest neighbor of the opposite class (NNO). To speed up the process, two spatial approximation sample hierarchical (SASH) trees are used for estimating the BRVs. Empirical results show that our data selection procedure can reduce a full dataset to the number of SVs, or only slightly more. Training with the selected subset gives performance comparable to that of the full dataset. For large datasets, the overall time spent in selecting and training on the smaller dataset is significantly lower than the time used in training on the full dataset.
more …
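The NNO criterion has a simple brute-force form: rank each vector by the distance to its nearest neighbor of the opposite class and keep the closest ones. The sketch below shows that idea only; the paper's contribution is avoiding this quadratic scan with SASH trees, which are not reproduced here:

```python
import math

def boundary_region_vectors(data, m):
    """Select the m vectors closest to their nearest neighbor of the
    opposite class (NNO) — brute-force version of BRV selection."""
    def nno_dist(i):
        xi, yi = data[i]
        return min(math.dist(xi, xj) for xj, yj in data if yj != yi)
    ranked = sorted(range(len(data)), key=nno_dist)
    return [data[i] for i in ranked[:m]]

# 1-D toy set: class 0 on the left, class 1 on the right.
data = [((0.0,), 0), ((1.0,), 0), ((2.0,), 1), ((9.0,), 1)]
print(boundary_region_vectors(data, 2))  # the points at 1.0 and 2.0 straddle the boundary
```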
By
MarinCastro, Heidy; Sucar, Enrique; Morales, Eduardo
Post to Citeulike
2 Citations
Automatic image annotation consists of automatically labeling images, or image regions, with a predefined set of keywords, which are regarded as descriptors of the high-level semantics of the image. In supervised learning, a set of previously annotated images is required to train a classifier. Annotating a large quantity of images by hand is a tedious and time-consuming process, so an alternative approach is to label manually only a small subset of images, using the remaining ones under a semi-supervised approach. In this paper, a new semi-supervised ensemble of classifiers, called WSA, for automatic image annotation is proposed. WSA uses naive Bayes as its base classifier. A set of these is combined in a cascade based on the AdaBoost technique. However, when training the ensemble of Bayesian classifiers, it also considers the unlabeled images at each stage. These are annotated based on the classifier from the previous stage, and then used to train the next classifier. The unlabeled instances are weighted according to a confidence measure based on their predicted probability value, while the labeled instances are weighted according to the classifier error, as in standard AdaBoost. WSA has been evaluated with benchmark data sets and two sets of images, with promising results.
more …
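The general self-training loop behind such semi-supervised schemes can be sketched as follows. Note the substitutions: WSA uses naive Bayes in an AdaBoost cascade, whereas this illustration uses a nearest-centroid classifier and a distance-gap confidence, purely to show the loop of labeling confident unlabeled points and retraining:

```python
import math

def centroid(vectors):
    return tuple(sum(c) / len(vectors) for c in zip(*vectors))

def self_train(labeled, unlabeled, rounds=3, margin=0.5):
    """Self-training sketch: label confident unlabeled points, retrain, repeat."""
    labeled = list(labeled)
    pool = list(unlabeled)
    for _ in range(rounds):
        classes = sorted({y for _, y in labeled})
        cents = {y: centroid([x for x, yy in labeled if yy == y]) for y in classes}
        newly, rest = [], []
        for x in pool:
            d = sorted((math.dist(x, cents[y]), y) for y in classes)
            # confidence: gap between the two closest class centroids
            if len(d) > 1 and d[1][0] - d[0][0] > margin:
                newly.append((x, d[0][1]))
            else:
                rest.append(x)   # ambiguous points stay unlabeled
        if not newly:
            break
        labeled += newly
        pool = rest
    return labeled

L = [((0.0, 0.0), "a"), ((4.0, 4.0), "b")]
U = [(0.2, 0.1), (3.9, 4.2), (2.0, 2.0)]   # the last point is ambiguous
result = self_train(L, U)
print(result)
```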
By
LópezEspinoza, Erika Danaé; AltamiranoRobles, Leopoldo
Post to Citeulike
In this paper, the deterministic component of the 2D Wold decomposition is used to obtain texture descriptors in industrial plastic quality images, and to reveal the hidden geometry of tree crowns in remote sensing images. The texture image is decomposed into two texture images: a non-deterministic texture and a deterministic one. In order to obtain texture descriptors, a set of discriminant texture features is selected from the deterministic component. The texture descriptors have been used to distinguish among three kinds of plastic quality, and are compared against texture descriptors obtained from the original image. To find the hidden geometry of tree crowns in remote sensing images, the deterministic component of the original image is analyzed. The observed geometry is compared against the geometry modeled in the marked point process literature.
more …
By
Aragón, Victoria S.; Esquivel, Susana C.; Coello Coello, Carlos A.
Post to Citeulike
4 Citations
In this paper, we present a novel model of an artificial immune system (AIS), based on the process that the T cell undergoes. The proposed model is used for solving constrained (numerical) optimization problems. The model operates on three populations: virgins, effectors, and memory. Each of them has a different role. The model also dynamically adapts the tolerance factor in order to improve the exploration capabilities of the algorithm. We further develop a new mutation operator which incorporates knowledge of the problem. We validate our proposed approach with a set of test functions taken from the specialized literature, and we compare our results with respect to Stochastic Ranking (an approach representative of the state of the art in the area) and with respect to a previously proposed AIS.
more …
By
Zamarripa, Myrna S.; Gonzalez, Victor M.; Favela, Jesus
Post to Citeulike
2 Citations
Even with the introduction of computer technology, paper-based artifacts remain ubiquitous in hospital settings. The need to manually transfer and update information from the physical to the digital realm is a common practice among hospital staff which, although usually well managed, at times becomes a source of errors and inconsistencies. This paper presents an augmented patient chart system that preserves the use of paper and allows capturing information directly into the system through the use of a digital pen. An evaluation of the system with 22 volunteers indicates a significant reduction in the number of errors while reading information, a significant increase in accuracy while annotating data, as well as a trend towards less time spent annotating data on the digital paper. Based on our results, we argue that the design preserves the advantages associated with paper while increasing the availability of information and its trustworthiness.
more …
By
Gómez, Octavio; González, Jesús A.; Morales, Eduardo F.
Post to Citeulike
6 Citations
Segmentation through seeded region growing is widely used because it is fast, robust, and free of tuning parameters. However, the seeded region growing algorithm requires an automatic seed generator, and has problems labeling unconnected pixels (the unconnected pixel problem). This paper introduces a new automatic seeded region growing algorithm, called ASRG-IB1, that performs the segmentation of color (RGB) and multispectral images. The seeds are automatically generated via histogram analysis; the histogram of each band is analyzed to obtain intervals of representative pixel values. An image pixel is considered a seed if its gray values for each band fall in some representative interval. After that, our new seeded region growing algorithm is applied to segment the image. This algorithm uses instance-based learning as its distance criterion. Finally, according to the user's needs, the regions are merged using ownership tables. The algorithm was tested on several leukemia medical images, showing good results.
more …
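The core region-growing loop is easy to illustrate on a tiny grayscale grid. This is not the paper's ASRG-IB1: it uses a fixed intensity tolerance as the join criterion instead of instance-based learning, and the seeds are given by hand rather than generated from histograms:

```python
from collections import deque

def region_grow(image, seeds, tol=10):
    """Grow labeled regions from seed pixels (BFS); a 4-neighbor joins a
    region if its value is within tol of the pixel it grows from."""
    rows, cols = len(image), len(image[0])
    label = [[0] * cols for _ in range(rows)]
    q = deque()
    for lab, (r, c) in enumerate(seeds, start=1):
        label[r][c] = lab
        q.append((r, c))
    while q:
        r, c = q.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and label[nr][nc] == 0 \
                    and abs(image[nr][nc] - image[r][c]) <= tol:
                label[nr][nc] = label[r][c]
                q.append((nr, nc))
    return label

img = [[10, 12, 90],
       [11, 13, 92],
       [10, 95, 91]]
print(region_grow(img, seeds=[(0, 0), (0, 2)]))  # dark region 1, bright region 2
```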
By
Fatto, Vincenzo; Laurini, Robert; Lopez, Karla; Loreto, Rosalva; MilleretRaffort, Françoise; Sebillo, Monica; SolMartinez, David; Vitiello, Giuliana
Show all (8)
Post to Citeulike
1 Citations
Chorems are schematized representations of territories, and so they can serve as a good visual summary of spatial databases. Indeed, for spatial decision-makers it is more important to identify and map problems than facts. Until now, chorems were made manually by geographers based on their own knowledge of the territory. An international project was therefore launched in order to automatically discover spatial patterns and lay out chorems starting from spatial databases. After examining some manually made chorems, some guidelines were identified. Then the architecture of a prototype system is presented, based on a canonical database structure, a subsystem for spatial pattern discovery based on spatial data mining, a subsystem for chorem layout, and a specialized language to represent chorems.
more …
By
Woodward, Alexander; Delmas, Patrice; Gimel’farb, Georgy; Marquez, Jorge
Show all (4)
Post to Citeulike
A complete system for creating the performance of a virtual character is described. Stereo web cameras perform marker-based motion capture to obtain rigid head motion and non-rigid facial expression motion. The acquired 3D points are then mapped onto a 3D face model with virtual muscle animation to create facial expressions. Muscle inverse kinematics updates muscle contraction parameters based on marker motion to create the character's performance. Advantages of the system are reduced character creation time through virtual muscles and a dynamic skin model, a novel way of applying markers to a face animation system, and its low-cost hardware requirements, capable of running on standard hardware and making it suitable for interactive media in end-user environments.
more …
By
Lara, Carlos; Romero, Leonardo
Post to Citeulike
This paper deals with the problem of finding the movement of a mobile robot given two consecutive laser scans. The proposed method extracts a line map from the sequence of points in each laser scan using a probabilistic approach, and then computes virtual corners between pairs of lines in the same line map. The movement of the robot is estimated from correspondences of virtual corners between the two line maps. The combination of the probabilistic approach to finding lines and the reduced number of virtual corners are the key ideas behind a simple, fast, outlier-robust, and reliable method for solving the local localization problem.
more …
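A virtual corner is simply the intersection of two extracted lines (which need not meet within the measured segments). Assuming lines in the form a·x + b·y = c — the probabilistic line extraction itself is omitted — the intersection follows from Cramer's rule:

```python
def virtual_corner(l1, l2, eps=1e-9):
    """Intersection of two lines given as (a, b, c) with a*x + b*y = c;
    returns None for (near-)parallel lines, which yield no virtual corner."""
    (a1, b1, c1), (a2, b2, c2) = l1, l2
    det = a1 * b2 - a2 * b1
    if abs(det) < eps:
        return None
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return (x, y)

# Two walls seen in a laser scan: x = 2 and y = 3 meet at the corner (2, 3).
print(virtual_corner((1.0, 0.0, 2.0), (0.0, 1.0, 3.0)))  # (2.0, 3.0)
```

Matching such corners between two consecutive scans then gives point correspondences from which the robot's motion can be estimated.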
By
GómezGil, Pilar; De los SantosTorres, Guillermo; NavarreteGarcía, Jorge; RamírezCortés, Manuel
Show all (4)
Post to Citeulike
The need to access information through the web and other kinds of distributed media makes it mandatory to convert almost every kind of document to a digital representation. However, many documents were created a long time ago, and currently, in the best cases, only scanned images of them are available when a digital transcription of their content is needed. For this reason, libraries across the world are looking for automatic OCR systems able to transcribe such documents. In this chapter we describe how artificial neural networks can be useful in the design of an optical character recognizer able to transcribe handwritten and printed old documents. The properties of neural networks give this OCR the ability to adapt to the styles of handwritten or antique fonts. Advances with two prototype parts of such an OCR are presented.
more …
By
Li, Xiaoou; Li, Kang
Post to Citeulike
RNA sequence detection is time-consuming because of its huge data set sizes. Although SVMs have proved useful, a standard SVM is not suitable for the classification of large data sets because of its high training complexity. A two-stage SVM classification approach is introduced for fast classification of large data sets. Experimental results on several RNA sequence detection tasks demonstrate that the proposed approach is promising for such applications.
more …
By
Gómez, Octavio; Morales, Eduardo F.; González, Jesús A.
Post to Citeulike
Instance-based learning algorithms are widely used due to their capacity to approximate complex target functions; however, the performance of this kind of algorithm degrades significantly in the presence of irrelevant features. This paper introduces a new noise-tolerant instance-based learning algorithm, called WIBK, that uses one or more weights, per feature per class, to classify integer-valued databases. A set of intervals representing the range of values of every feature is automatically created for each class, and the non-representative intervals are discarded. The remaining (representative) intervals of each feature are compared against the representative intervals of the same feature in the other classes to assign a weight. The weight represents the discriminative power of the interval, and is used in the similarity function to improve classification accuracy. The algorithm was tested on several datasets and compared against other representative machine learning algorithms, showing very competitive results.
more …
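One hypothetical way to build such per-class intervals from integer feature values is sketched below. The merging rule (consecutive values join an interval) and the support threshold are illustrative assumptions, not the paper's definition of representative intervals:

```python
def representative_intervals(values, min_support=2):
    """Merge sorted integer feature values into runs of consecutive values,
    then keep only intervals observed at least min_support times."""
    intervals, start, prev, count = [], None, None, 0
    for v in sorted(values):
        if start is None:
            start, prev, count = v, v, 1
        elif v <= prev + 1:                  # extends the current run
            prev, count = v, count + 1
        else:                                # gap: close the run, start a new one
            intervals.append((start, prev, count))
            start, prev, count = v, v, 1
    if start is not None:
        intervals.append((start, prev, count))
    return [(lo, hi) for lo, hi, n in intervals if n >= min_support]

# Feature values observed for one class: the isolated 9 is discarded as noise.
print(representative_intervals([1, 2, 2, 3, 9, 15, 16]))  # [(1, 3), (15, 16)]
```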
By
Chávez, Edgar; Mitton, Nathalie; Tejeda, Héctor
Post to Citeulike
9 Citations
Sensor networks are wireless ad-hoc networks in which all nodes cooperate to route messages in the absence of a fixed infrastructure. Non-flooding, guaranteed-delivery routing protocols are preferred because sensor networks have limited battery life. Location-aware routing protocols are good candidates for sensor network applications; nevertheless, they need either an external location service like GPS or Galileo (which requires bulky, energy-consuming devices) or internal location services providing non-unique virtual coordinates, leading to low delivery rates. In this paper we introduce Position Trees, a collision-free, distributed labeling algorithm based on hop counting, which embeds a spanning tree of the underlying network. Routing with Position Trees (RPT) is a guaranteed-delivery, non-flooding, efficient implicit routing protocol based on Position Trees. We study experimentally the statistical properties of the memory requirements and the routing efficiency of RPT.
more …
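The idea of embedding a spanning tree in node labels can be illustrated with a BFS labeling in which each node's label is its parent's label plus a child index. This is only the underlying idea; the actual Position Trees algorithm is distributed, based on hop counting, and guarantees collision freedom, none of which this centralized sketch reproduces:

```python
from collections import deque

def position_tree_labels(adj, root):
    """BFS labeling sketch: each node's label is its parent's label extended
    with a child index, so the labels embed a spanning tree of the network."""
    labels = {root: ()}
    q = deque([root])
    while q:
        v = q.popleft()
        child = 0
        for u in adj[v]:
            if u not in labels:          # first visit: v becomes u's tree parent
                labels[u] = labels[v] + (child,)
                child += 1
                q.append(u)
    return labels

net = {"a": ["b", "c"], "b": ["a", "d"], "c": ["a"], "d": ["b"]}
print(position_tree_labels(net, "a"))
```

Routing along the tree then reduces to climbing to the longest common label prefix and descending, with no flooding.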
By
Mata, Felix
Post to Citeulike
7 Citations
The Geographic Information Science community recognizes that modern Geographic Information Retrieval (GIR) systems should support the processing of imprecise data distributed over heterogeneous repositories. This means searching for relevant geographic results for a geographic query (Q_G) even if the data sources do not contain a result that matches the user's request exactly, in which case approximated results would be useful. Therefore, GIR systems should be centred on the nature and essence of spatial data (their relations and properties), taking the user's profile into consideration. Usually, semantic features are implicitly present in different data sources. In this work, we use three heterogeneous data sources: vector data, a geographic ontology, and geographic dictionaries. These repositories usually store topological relations, concepts, and descriptions of geographical objects under certain scenarios. In contrast to previous work, where these layers have been treated in an isolated way, their integration is expected to be a better solution for capturing the semantics of spatial objects. Thus, the use of spatial semantics and the integration of different information layers improve GIR, because adequate retrieval parameters according to the nature of spatial data, which emulate the user's requirements, can be established. In particular, we use the topological relations {inside, in}, the semantic relations {hypernymy, meronymy}, and the descriptions {constraints, representation}. An information extraction mechanism is designed for each data source, while the integration process is performed using an ontology exploration algorithm. The ranking process is based on similarity measures, using the previously developed confusion theory. Finally, we present a case study to show some results of integrated GIR (iGIR) and compare them with Google's results in tabular form.
By
Vázquez, Roberto A.; Sossa, Humberto; Garro, Beatriz A.
3 Citations
In this paper we propose a view-based method for 3D object recognition based on some biological aspects of infant vision. The biological hypotheses of this method are based on the role of the response to low frequencies at early stages, and on some conjectures concerning how an infant detects subtle features (stimulating points) in an object. In order to recognize an object from different images of it (orientations from 0° to 100°) we make use of a dynamic associative memory (DAM). Since infant vision responds to the low frequencies of the signal, a low-pass filter is first used to remove the high-frequency components from the image. Then we detect subtle features in the image by means of a random feature-selection detector. Finally, the DAM is fed with this information for training and recognition. To test the accuracy of the proposal we use the Columbia Object Image Library (COIL-100) database.
By
Klimov, A. B.; Romero, J. L.; Björk, G.; Sánchez-Soto, L. L.
1 Citation
We propose a unifying phase-space approach to the construction of mutually unbiased bases for an n-qubit system. It is based on an explicit classification of the geometrical structures compatible with the notion of unbiasedness. These consist of bundles of discrete curves intersecting only at the origin and satisfying certain additional conditions. The effect of local transformations is also studied.
By
Tentori, Monica; Favela, Jesus
3 Citations
Highly mobile hospital workers experience intense and ad hoc collaboration during their everyday practices. This has motivated the introduction of collaborative applications enhanced with ubiquitous technology in hospitals. However, an environment filled with many different systems augmented with a wide range of functionality introduces an extra burden for hospital workers in selecting the services and information that are adequate for the task at hand. Activity-Based Computing (ABC) has emerged as a new interaction paradigm to address these problems. In this paper, we augment the vision of ABC with a degree of consciousness of the changing physical context, towards the design of activity-aware applications. Based on workplace studies conducted in a hospital, we established a set of design principles for the development of activity-aware applications. To exemplify the proposed design principles, we designed and implemented an activity-aware map that personalizes the information shown to hospital workers, enforces availability, and sends collaboration warnings.
By
Frausto-Solis, Juan; Román, E. F.; Romero, David; Soberon, Xavier; Liñán-García, Ernesto
3 Citations
In this paper a Simulated Annealing algorithm (SA) for solving the Protein Folding Problem (PFP) is presented. This algorithm has two phases: quenching and annealing. The first phase is applied at very high temperatures, while the annealing phase is applied at high and low temperatures. The temperature during the quenching phase is decreased by an exponential function. We use an efficient analytical method to tune the algorithm’s parameters. This method allows the temperature to change in accordance with solution quality, which can save a large amount of execution time for PFP.
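The exponential temperature schedule described above can be sketched as follows. This is a minimal, hypothetical illustration of simulated annealing with an exponential cooling function, not the authors' tuned PFP implementation; the function names and parameter values are assumptions.

```python
import math
import random

def simulated_annealing(energy, neighbor, x0,
                        t_start=1000.0, t_min=0.01,
                        alpha=0.95, steps_per_temp=50, seed=42):
    """Generic SA sketch: at each temperature, propose neighbors and accept
    them via the Metropolis criterion; the temperature decays exponentially
    (t <- alpha * t), as in the quenching schedule the abstract mentions."""
    rng = random.Random(seed)
    x, e = x0, energy(x0)
    best, best_e = x, e
    t = t_start
    while t > t_min:
        for _ in range(steps_per_temp):
            cand = neighbor(x, rng)
            e_cand = energy(cand)
            # Metropolis acceptance: always take improvements, sometimes worse moves
            if e_cand < e or rng.random() < math.exp((e - e_cand) / t):
                x, e = cand, e_cand
                if e < best_e:
                    best, best_e = x, e
        t *= alpha  # exponential temperature decrease
    return best, best_e
```

For example, minimizing the toy energy `(x - 3)^2` with a small random-step neighbor converges close to 3; a real PFP energy function would replace it.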
By
Moreno-Ibarra, Marco
2 Citations
The paper presents an approach to verifying the consistency of generalized geospatial data at a conceptual level. The principal stages of the proposed methodology are Analysis, Synthesis, and Verification. Analysis focuses on extracting the peculiarities of spatial relations by means of quantitative measures. Synthesis is used to generate a conceptual representation (ontology) that explicitly and qualitatively represents the relations between geospatial objects, resulting in tuples called herein semantic descriptions. Verification consists of a comparison between two semantic descriptions (descriptions of the source and the generalized data): we measure the semantic distance (confusion) between local ontology concepts, generating three global concepts, Equal, Unequal, and Equivalent, which measure the (in)consistency of the generalized data: Equal and Equivalent indicate consistency, while Unequal indicates an inconsistency. The method does not depend on coordinates, scales, units of measure, cartographic projection, representation format, geometric primitives, and so on. The approach is applied and tested on the generalization of two topographic layers: rivers and elevation contour lines (case study).
By
Barrón, Ricardo; Sossa, Humberto; Cruz, Benjamín
In this work we present an algorithm for training an associative memory based on the so-called multi-layered morphological perceptron with maximal support neighborhoods. We compare the proposal with the original one by performing experiments with real images, showing the superiority of the new algorithm. We also give formal conditions for correct classification, and show that the proposal can be applied to gray-level images and not only to binary images.
By
Hayet, Jean-Bernard; Piater, Justus
3 Citations
This article proposes a global approach to the rectification of sport sequences, estimating the mapping from the video images to the terrain in the ground plane without using position sensors on the TV camera. Our strategy relies on three complementary techniques: (1) initial homography estimation using line-feature matching, (2) homography estimation with line-feature tracking, and (3) incremental homography estimation through point-feature tracking. Together, they allow continuous homography estimation over time, even during periods where the video does not contain sufficient line features to determine the homography from scratch. We illustrate the complementarity of the three techniques on a set of challenging examples.
By
Hernández-Gracidas, Carlos; Sucar, L. Enrique
5 Citations
Content-based image retrieval (CBIR) is currently limited by the lack of representational power of low-level image features, which fail to properly represent the actual contents of an image; consequently, poor results are achieved when this information is used alone. Spatial relations represent a class of high-level image features which can improve image annotation. We apply spatial relations to automatic image annotation, a task which is usually a first step towards CBIR. We follow a probabilistic approach to represent different types of spatial relations in order to improve the automatic annotations obtained from low-level features. Different configurations and subsets of the computed spatial relations were used to perform experiments on a database of landscape images. Results show a noticeable improvement of almost 9% compared to the baseline results obtained using the k-Nearest Neighbor classifier.
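The k-Nearest Neighbor baseline that the spatial-relation approach is compared against can be sketched roughly as follows. The feature vectors and labels here are hypothetical; the abstract does not specify the paper's actual features or distance metric, so plain Euclidean distance is assumed.

```python
import math
from collections import Counter

def knn_annotate(train, query, k=3):
    """Assign a label to an image region from its low-level feature vector by
    majority vote among the k nearest training regions (Euclidean distance).
    `train` is a list of (feature_vector, label) pairs."""
    dists = sorted((math.dist(vec, query), label) for vec, label in train)
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]
```

The probabilistic spatial-relation step described in the abstract would then post-process such per-region annotations rather than replace them.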
By
Sossa, Humberto; Barrón, Ricardo; Vázquez, Roberto A.
1 Citation
In this paper we study how the performance of a median associative memory is influenced when the values of its elements are altered by noise. To our knowledge this kind of research has not been reported until now. We give formal conditions under which the memory is still able to correctly recall a pattern of the fundamental set of patterns from either an unaltered or a noisy version of it. Experiments are also given to show the efficiency of the proposal.
By
González B., J. Javier; Pazos R., Rodolfo A.; Gelbukh, Alexander; Sidorov, Grigori; Fraire H., Hector; Cruz C., I. Cristina
This paper presents the treatment of prepositions and conjunctions in natural language interfaces to databases (NLIDBs), which allows a better translation of queries expressed in natural language into formal languages. Prepositions and conjunctions have not been sufficiently studied for their usage in NLIDBs, because most NLIDBs just look for keywords in the sentences and focus their analysis on nouns and verbs, discarding auxiliary words in the query. This paper shows that prepositions and conjunctions can be represented as operations using formal set theory. Additionally, since prepositions and conjunctions keep their meaning in any context, their treatment is domain independent. In our experiments we used the Spanish language. We validate our approach using two databases, Northwind and Pubs from SQL Server, with a corpus of 198 different queries for the first one and 70 queries for the second one. 84% of the queries were translated correctly for the Northwind database and 80% for Pubs.
By
Vega, Gerardo
8 Citations
We use techniques from linear recurring sequences, exponential sums and Gaussian sums in order to present a set of characterizations for the one-weight irreducible cyclic codes over finite fields. Without using such techniques, a subset of these characterizations was already presented in [2]. By means of this new set of characterizations, we give an explicit expression for the number of one-weight cyclic codes when the length and dimension are given.
By
Torres, Miguel; Levachkine, Serguei; Moreno, Marco; Quintero, Rolando; Guzmán, Giovanni
Many types of information are geographically referenced, and interactive maps provide a natural user interface to such data. However, the process of accessing and retrieving geospatial data presents several problems related to the heterogeneity and interoperability of geospatial information, so information integration and semantic heterogeneity are not trivial tasks. Therefore, we propose a web-mapping system focused on retrieving geospatial information by means of geospatial ontologies and representing this information on the Internet. Moreover, a Multi-Agent System is proposed to handle the process of obtaining tourist geoinformation, which aids the information-integration task for the several nodes (geographic sites) involved in this application. The agent system provides the mechanism for communicating between different distributed and heterogeneous Geographic Information Systems and retrieves the data by means of a GML description. This paper also proposes an interoperability approach based on geospatial ontology matching, performed by the Multi-Agent System in each node considered in the application. The retrieval mechanism is based on encoding the information in a GML description to link each geospatial datum with a concept of the proposed ontologies.
By
Rosales-Silva, Alberto; Ponomaryov, Volodymyr I.; Gallegos-Funes, Francisco J.
1 Citation
We propose a fuzzy logic recursive scheme using directional processing for motion detection and spatio-temporal filtering to decrease Gaussian noise corruption. We introduce novel ideas that employ the differences between images: these differences are related through their angle deviations, yielding several parameters that are applied in a robust algorithm capable of detecting and differentiating movement against a noisy background.
By
Ramírez-Manzanares, Alonso; Rivera, Mariano; Kornprobst, Pierre; Lauze, François
2 Citations
We propose a variational approach for multivalued velocity field estimation in transparent sequences. Starting from existing local motion estimators, we show a variational model for integrating in space and time these local estimations to obtain a robust estimation of the multivalued velocity field. With this approach, we can indeed estimate some multivalued velocity fields which are not necessarily piecewise constant on a layer: Each layer can evolve according to nonparametric optical flow. We show how our approach outperforms some existing approaches, and we illustrate its capabilities on several challenging synthetic/real sequences.
By
Rojas, Alejandro; Cumplido, René; Carrasco-Ochoa, J. Ariel; Feregrino, Claudia; Martínez-Trinidad, J. Francisco
3 Citations
Irreducible testors (also called typical testors) are a useful tool for feature selection in supervised classification problems with mixed incomplete data. However, the complexity of computing all irreducible testors of a training matrix grows exponentially with the number of columns in the matrix. For this reason, different approaches such as heuristic algorithms and parallel and distributed processing have been developed. In this paper, we present the design and implementation of a custom architecture for the BT algorithm, which computes testors from a given input matrix. The architectural design is based on a parallel approach that is suitable for highly populated input matrices. The architecture has been designed to process all matrix rows in parallel and to generate candidates automatically, and it can be configured for any matrix size. The architecture is able to evaluate whether a feature subset is a testor of the matrix and to calculate the next candidate to be evaluated in a single clock cycle. The architecture has been implemented on a Field Programmable Gate Array (FPGA) device. Results show that it provides significant performance improvements over a previously reported hardware implementation. Implementation results are presented and discussed.
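For intuition, the property the architecture evaluates in a single clock cycle (whether a feature subset is a testor) can be sketched in software using the standard definition: no two rows from different classes coincide on the selected columns. The brute-force search below is illustrative only; it is precisely the exponential computation the hardware design seeks to accelerate.

```python
from itertools import combinations

def is_testor(matrix, labels, cols):
    """A column subset is a testor if no two rows belonging to different
    classes become identical when restricted to those columns."""
    for i in range(len(matrix)):
        for j in range(i + 1, len(matrix)):
            if labels[i] != labels[j]:
                if all(matrix[i][c] == matrix[j][c] for c in cols):
                    return False
    return True

def irreducible_testors(matrix, labels):
    """Exhaustive search: keep the testors none of whose proper
    subsets is also a testor (irreducible / typical testors)."""
    n = len(matrix[0])
    testors = [set(c) for r in range(1, n + 1)
               for c in combinations(range(n), r)
               if is_testor(matrix, labels, c)]
    return [t for t in testors if not any(s < t for s in testors)]
```

For the toy Boolean matrix `[[1,0,1],[0,1,1],[1,1,0]]` with classes `[0,0,1]`, the irreducible testors are `{2}` and `{0,1}`.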
By
Torres-Huitzil, Cesar; Girau, Bernard; Gauffriau, Adrien
3 Citations
The performance of configurable digital circuits such as Field Programmable Gate Arrays (FPGAs) increases at a very fast rate. Their fine-grained parallelism shows great similarities with connectionist models, which is the motivation for numerous works on neural network implementations on FPGAs, targeting applications such as autonomous robotics, ambulatory medical systems, etc. Nevertheless, such implementations are performed with an ASPC (Application-Specific Programmable Circuits) approach that requires strong hardware expertise. In this paper a high-level design framework for FPGA-based implementations of neural networks from high-level specifications is presented, but the final goal of the project is a hardware/software codesign environment for embedded implementations of most classical neural topologies. Such a framework aims at providing the connectionist community with efficient automatic FPGA implementations of their models without any advanced knowledge of hardware. A software platform currently under development, NNetWAREBuilder, handles multilayer feedforward and graphically designed neural networks and automatically compiles them onto FPGA devices with third-party synthesis tools. The internal representation of a neural model is bound to commonly used hardware computing units in a library to create the hardware model. Experimental results are presented to evaluate design and implementation trade-offs.
By
Psenicka, B.; Bello, R. Bustamante; Rodriguez, M. A.
4 Citations
This paper presents a general matrix algorithm for the analysis and synthesis of digital filters. A useful method for computing the state-space matrix of a general digital network and a new technique for the design of digital filters are shown by means of examples. The method proposed in this paper allows the analysis of digital filters and the construction of new equivalent structures of the canonic and non-canonic digital filter forms. Equivalent filters with different structures can be found according to various matrix expansions. The procedure proposed in this paper is more efficient and economical than traditional methods because it permits constructing circuits with a minimum of shifting operations.
By
Montiel, Oscar; Castillo, Oscar; Melin, Patricia; Sepúlveda, Roberto
There exists no standard method for obtaining a nonlinear input-output model using the external dynamics approach. In this work, we use an evolutionary optimization method to estimate the parameters of an NFIR model using the Wiener model structure. Specifically, we use a Breeder Genetic Algorithm (BGA) with fuzzy recombination to perform the optimization. We selected the BGA since it uses real parameters (it does not require any string coding), which can be manipulated directly by the recombination and mutation operators. For training the system we used an amplitude-modulated pseudo-random binary signal (APRBS). The adaptive system was tested using sinusoidal signals.
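The fuzzy recombination operator mentioned above acts directly on real-valued genes. A common formulation (due to Voigt et al., which may differ in detail from the operator the authors use) draws each offspring gene from a triangular distribution centered on one parent's gene, with a spread proportional to the distance between the parents. A hedged sketch:

```python
import random

def fuzzy_recombination(p1, p2, d=0.5, rng=random):
    """Fuzzy recombination sketch for real-coded genomes: each offspring
    gene is sampled from a triangular distribution centered on one of the
    two parent genes; `d` scales the spread to the inter-parent distance."""
    child = []
    for a, b in zip(p1, p2):
        center = a if rng.random() < 0.5 else b
        spread = d * abs(a - b)
        # random.triangular returns `center` exactly when the spread is zero
        child.append(rng.triangular(center - spread, center + spread, center))
    return child
```

No string encoding is needed, which is the property of the BGA the abstract highlights.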
By
Aldape-Pérez, Mario; Yáñez-Márquez, Cornelio; Argüelles-Cruz, Amadeo José
2 Citations
Performance in most pattern classifiers improves when redundant or irrelevant features are removed; however, this is mainly achieved by computationally demanding methods or the construction of successive classifiers. This paper shows how Associative Memories can be used to obtain a mask value representing a subset of features that clearly identifies irrelevant or redundant information for classification purposes; classification accuracy is thus improved while significant computational costs in the learning phase are reduced. An optimal subset of features allows register-size optimization, which contributes not only to significant power savings but also to a smaller amount of synthesized logic. Furthermore, improved hardware architectures are achieved due to the reduction in functional-unit size; as a result, it is possible to implement parallel and cascade schemes for pattern classifiers on the same ASIC.
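A mask value of the kind described above is naturally an integer whose bits select the surviving features. The following minimal sketch (the function names and the classifier hook are hypothetical, not the paper's API) shows how such a mask prunes a sample vector and how a masked feature subset would be evaluated:

```python
def apply_mask(sample, mask):
    """Keep only the features whose bit is set in `mask` (bit i <-> feature i)."""
    return [x for i, x in enumerate(sample) if (mask >> i) & 1]

def mask_accuracy(data, labels, mask, classify):
    """Fraction of samples a classifier labels correctly when it only
    sees the features selected by `mask`."""
    hits = sum(classify(apply_mask(s, mask)) == y for s, y in zip(data, labels))
    return hits / len(data)
```

For example, `apply_mask([10, 20, 30, 40], 0b0101)` keeps features 0 and 2, yielding `[10, 30]`; in hardware, the same mask width directly determines the register sizes the abstract discusses.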
By
Mejía-Lavalle, Manuel; Morales, Eduardo F.; Arroyo, Gustavo
2 Citations
We present two feature selection methods, inspired by Shannon’s entropy and the Information Gain measure, that are easy to implement. These methods apply when we have a database with continuous attributes and a discrete multi-class label. The first method applies when the attributes are independent of each other given the class. The second method is useful when we suspect that interdependencies among the attributes exist. In experiments with synthetic and real databases, the proposed methods are shown to be fast and to produce near-optimum solutions, with a good feature-reduction ratio.
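The Information Gain computation underlying both methods can be sketched for a single continuous attribute split at a threshold; this is the textbook formulation, and the actual discretization scheme the authors use is not detailed in the abstract.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a discrete class-label list."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(values, labels, threshold):
    """Gain from splitting a continuous attribute at `threshold`:
    H(class) minus the size-weighted entropy of the two partitions."""
    left = [y for v, y in zip(values, labels) if v <= threshold]
    right = [y for v, y in zip(values, labels) if v > threshold]
    if not left or not right:      # degenerate split carries no information
        return 0.0
    n = len(labels)
    remainder = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
    return entropy(labels) - remainder
```

A feature-selection pass would score each attribute by its best threshold and drop attributes whose gain is near zero.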
By
Schütze, Oliver; Coello Coello, Carlos A.; Talbi, El-Ghazali
4 Citations
In this paper we develop a framework for the approximation of the entire set of ε-efficient solutions of a multiobjective optimization problem with stochastic search algorithms. For this, we propose the set of interest, investigate its topology, and state a convergence result for a generic stochastic search algorithm toward this set. Finally, we present some numerical results indicating the practicability of the novel approach.
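One way to make such a set of interest concrete is via an ε-dominance relation. The sketch below uses an assumed definition (for minimization, a point is ε-dominated if some other point beats it by a margin of ε in every objective), which may differ from the paper's exact formulation; the surviving points then form a superset of the Pareto front containing the near-optimal solutions.

```python
def eps_dominates(fa, fb, eps):
    """fa epsilon-dominates fb (minimization) if fa_i + eps_i <= fb_i for
    every objective i, strictly for at least one of them."""
    return (all(a + e <= b for a, b, e in zip(fa, fb, eps))
            and any(a + e < b for a, b, e in zip(fa, fb, eps)))

def eps_archive(points, eps):
    """Keep the points not epsilon-dominated by any other point; these
    approximate the set of epsilon-efficient solutions."""
    return [p for p in points
            if not any(eps_dominates(q, p, eps) for q in points if q != p)]
```

Note that with ε > 0 the archive deliberately retains slightly suboptimal points, which is the practical appeal of ε-efficiency.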
By
Aguilar, Raúl A.; Antonio, Angélica; Imbert, Ricardo
In the teamwork research area there is increasing interest in the principles behind team effectiveness and effective team training; for Intelligent Virtual Agents (IVAs), team training is an excellent application area. Nevertheless, the few reported works on IVAs in team training illustrate both their use for the individualized teaching of procedural tasks (Pedagogical Agents) and the substitution of missing team members (Teammate Agents) to promote the practice of team tasks in relation to functional roles (taskwork) [1].
Our interest in Intelligent Virtual Environments for Training (IVETs) has led us to propose a Team Training Strategy (TTS) whose purpose is to promote social skills as well as knowledge and skills related to tasks of a socio-technical nature. The alternatives that we are evaluating to improve the performance of human groups and to promote effective teams deal with: the use of scaffolding as the best tutoring approach, the promotion of social skills before technical skills, and, especially, the selection of the best non-functional role (team role) balance according to the task.
In addition, our aim is to incorporate into an IVA called Pancho (Pedagogical AgeNt to support Colaborative Human grOups) the particular behaviors of the Team Roles defined by Belbin [2]. Pancho, with a team role selected according to a team model, will join the human group with the intention of improving the performance of the team (teamwork) and providing scaffolding to the trainees (taskwork). Belbin’s categorization is the earliest and still the most popular. He states that a team role can be defined as a tendency to behave, contribute and interrelate with each other at work in certain distinctive ways; he also states that in teamwork, a good mix of team roles in the group is necessary for groups to use their technical skills optimally. The team roles defined by Belbin have very particular behaviors; we have selected a generic cognitive architecture for agents with emotionally influenced behaviors, called COGNITIVA, to realize those roles [3]. The constructs provided by this architecture (personal traits, concerns, moods, attitudes and physical states) are being properly instantiated to generate the desired behaviors.
By
Vallejo, Edgar E.; Cody, Martin L.; Taylor, Charles E.
4 Citations
In this paper, we propose the application of hierarchical self-organizing maps to the unsupervised acoustic classification of bird species. We describe a series of experiments on the automated categorization of tropical antbirds from their songs. Experimental results showed that accurate classification can be achieved using the proposed model. In addition, we discuss how categorization capabilities could be deployed in sensor arrays.
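A full hierarchical SOM is beyond the scope of an abstract, but the core self-organizing step can be sketched with a one-dimensional map. This toy version (names, decay schedules, and parameters are all assumptions) clusters feature vectors in the same unsupervised way the species categorization would cluster song features.

```python
import math
import random

def train_som(data, n_units=4, epochs=60, lr0=0.5, sigma0=1.5, seed=1):
    """Minimal 1-D self-organizing map: units compete for each input and the
    winner's neighborhood moves toward it; the learning rate and the
    neighborhood width both decay over time."""
    rng = random.Random(seed)
    dim = len(data[0])
    units = [[rng.random() for _ in range(dim)] for _ in range(n_units)]
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)
        sigma = sigma0 * (1 - t / epochs) + 0.1
        for x in data:
            # best-matching unit (BMU) for this input
            bmu = min(range(n_units), key=lambda u: math.dist(units[u], x))
            for u in range(n_units):
                h = math.exp(-((u - bmu) ** 2) / (2 * sigma ** 2))
                units[u] = [w + lr * h * (xi - w) for w, xi in zip(units[u], x)]
    return units

def classify(units, x):
    """Unsupervised cluster index of x = index of its best-matching unit."""
    return min(range(len(units)), key=lambda u: math.dist(units[u], x))
```

After training on two well-separated clusters of 2-D points, inputs from different clusters map to different units, which is the categorization behavior the abstract reports for song features.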
By
Serrato Paniagua, Ramiro; Flores Romero, Juan J.; Coello Coello, Carlos A.
In this work we design a genetic representation and its genetic operators to encode individuals for evolving dynamic system models in Qualitative Differential Equation form, for system identification. The proposed representation can be implemented in almost any programming language without the need for complex data structures, and it makes it possible to encode an individual whose phenotype is a Qualitative Differential Equation in QSIM representation. The Evolutionary Computation paradigm we propose for evolving structures like those found in the QSIM representation is a variation of Genetic Programming called Gene Expression Programming. Our proposal represents an important variation of the multi-gene chromosome structure of Gene Expression Programming at the level of the gene codification structure. This gives us an efficient way of evolving QSIM Qualitative Differential Equations and the basis of an Evolutionary Computation approach to qualitative system identification.
By
Ortiz, Floriberto; Yu, Wen; Moreno-Armendariz, Marco; Li, Xiaoou
The normal fuzzy CMAC neural network performs well because of its fast learning speed and local generalization capability for approximating nonlinear functions. However, it requires huge memory, and its dimension increases exponentially with the number of inputs. In this paper, we use a recurrent technique to overcome these problems and propose a new CMAC neural network, named recurrent fuzzy CMAC (RFCMAC). Since the structure of the RFCMAC is more complex, normal training methods are difficult to apply. A new simple algorithm with a time-varying learning rate is proposed to ensure that the learning algorithm is stable.
By
Vazquez, Roberto A.; Sossa, Humberto; Garro, Beatriz A.
Several associative memories (AMs) have been proposed in recent years. These AMs have several constraints that limit their applicability to complex problems such as face recognition. Despite the power of these models, they cannot reach their full potential without new mechanisms based on current and future studies of biological neural networks. In this research we show how a network of dynamic associative memories (DAMs), combined with some aspects of the infant vision system, can be efficiently applied to the face recognition problem. The accuracy of the proposal is tested through several experiments using a benchmark of faces.
more …
By
Lasserre, Jean B.; Zeron, Eduardo S.
Post to Citeulike
1 Citations
Given z ∈ ℂ^{n} and A ∈ ℤ^{m×n}, we provide an explicit expression and an algorithm for evaluating the counting function h(y; z) := ∑ { z^{x} : x ∈ ℤ^{n}, Ax = y, x ≥ 0 }. The algorithm only involves simple (but possibly numerous) calculations. In addition, we exhibit finitely many fixed convex cones of ℝ^{n}, explicitly and exclusively defined by A, such that for any y ∈ ℤ^{m}, h(y; z) is obtained by a simple formula that evaluates ∑ z^{x} over the integral points of those cones only. Finally, we also provide an alternative (and different) formula from a decomposition of the generating function into simpler rational fractions, which is easy to invert.
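For small instances, the counting function can be cross-checked by direct enumeration. The brute-force sketch below (assuming A has positive entries so the solution set is finite, and using a hypothetical search bound) is only a reference implementation, not the authors' algorithm, which avoids enumeration entirely.

```python
from itertools import product

def h_count(A, y, z, bound=50):
    """Brute-force evaluation of h(y; z) = sum of z^x (i.e., the product of
    z_j ** x_j) over the nonnegative integer solutions x of A x = y.
    Entries of A are assumed positive so the solution set is finite;
    `bound` caps the per-coordinate search box."""
    m, n = len(A), len(A[0])
    total = 0.0
    for x in product(range(bound + 1), repeat=n):
        if all(sum(A[i][j] * x[j] for j in range(n)) == y[i] for i in range(m)):
            term = 1.0
            for zj, xj in zip(z, x):
                term *= zj ** xj
            total += term
    return total
```

For example, with A = [[1, 1]], y = [2] and z = (1/2, 1/2), the solutions (0,2), (1,1), (2,0) each contribute 1/4, so h = 3/4.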
