Showing 1 to 100 of 314 matching Articles
By
Mancillas-López, Cuauhtemoc; Chakraborty, Debrup; Rodríguez-Henríquez, Francisco
Tweakable enciphering schemes are a certain type of block-cipher mode of operation which provides security in the sense of a strong pseudorandom permutation. It has been proposed that these modes are suitable for in-place disk encryption. Currently there are many proposals available for these schemes. EME is one of the efficient candidates in this category. EME2 is a derivative of EME which is currently one of the candidates in a draft standard for wide-block modes by the IEEE working group on storage security. We show some weaknesses of these two modes, assuming that some side-channel information is available.
By
Beuchat, Jean-Luc; Detrey, Jérémie; Estibals, Nicolas; Okamoto, Eiji; Rodríguez-Henríquez, Francisco
5 Citations
This paper is devoted to the design of fast parallel accelerators for the cryptographic Tate pairing in characteristic three over supersingular elliptic curves. We propose here a novel hardware implementation of Miller’s loop based on a pipelined Karatsuba-Ofman multiplier. Thanks to a careful selection of algorithms for computing the tower field arithmetic associated with the Tate pairing, we manage to keep the pipeline busy. We also describe the strategies we considered to design our parallel multiplier. They are included in a VHDL code generator allowing for the exploration of a wide range of operators. Then, we outline the architecture of a coprocessor for the Tate pairing over
$\mathbb{F}_{3^m}$
. However, a final exponentiation is still needed to obtain a unique value, which is desirable in most cryptographic protocols. We supplement our pairing accelerator with a coprocessor responsible for this task. An improved exponentiation algorithm allows us to save hardware resources.
According to our place-and-route results on Xilinx FPGAs, our design improves both the computation time and the area-time trade-off compared to previously published coprocessors.
By
Mata, Felix
1 Citation
Geographic Information Ranking consists of measuring whether a document (answer) is relevant to a spatial query. It is done by comparing characteristics shared by the document and the query. The most popular approaches compare just one aspect of geographical data (geographic properties, topology, among others), which limits the assessment of document relevance. Nevertheless, the assessment can be improved when key characteristics of geographical objects are considered in the ranking: (1) geographical attributes, (2) topological relations, and (3) geographical concepts. In this paper, we outline iRank, a method that integrates these three aspects to rank a document. Our approach evaluates documents from three sources of information: GeoOntologies, dictionaries, and topology files. Relevance is measured in three stages. In the first stage, relevance is computed by processing concepts; in the second stage, relevance is calculated using geographic attributes. In the last stage, relevance is measured by computing topological relations. Thus, the main contribution of iRank is to show that integrating the three ranking criteria is better than using them separately.
By
Monroy, Raúl; Bundy, Alan; Green, Ian
1 Citation
Unique Fixpoint Induction (UFI) is the chief inference rule to prove the equivalence of recursive processes in the Calculus of Communicating Systems (CCS) (Milner 1989). It plays a major role in the equational approach to verification. Equational verification is of special interest as it offers theoretical advantages in the analysis of systems that communicate values, have infinite state space or show parameterised behaviour. We call these kinds of systems VIPSs, an acronym for Value-passing, Infinite-State and Parameterised Systems. Automating the application of UFI in the context of VIPSs has been neglected. This is both because many VIPSs are given in terms of recursive function symbols, making it necessary to carefully apply induction rules other than UFI, and because proving that one VIPS process constitutes a fixpoint of another involves computing a process substitution, mapping states of one process to states of the other, that often is not obvious. Hence, VIPS verification is usually turned into equation solving (Lin 1995a). Existing tools for this proof task, such as VPAM (Lin 1993), are highly interactive. We introduce a method that automates the use of UFI. The method uses middle-out reasoning (Bundy et al. 1990a) and so is able to apply the rule even without elaborating the details of the application. The method introduces meta-variables to represent those parts of the processes’ state space that, at application time, were not known, hence changing from equation verification to equation solving. Adding this method to the equation plan developed by Monroy et al. (Autom Softw Eng 7(3):263–304, 2000a), we have implemented an automatic verification planner. This planner increases the number of verification problems that can be dealt with fully automatically, thus improving upon the current degree of automation in the field.
By
Esponda, Fernando; Forrest, Stephanie; Helman, Paul
14 Citations
In a negative representation, a set of elements (the positive representation) is depicted by its complement set. That is, the elements in the positive representation are not explicitly stored, while those in the negative representation are. The concept, feasibility, and properties of negative representations are explored in the paper; in particular, their potential to address privacy concerns. It is shown that a positive representation consisting of n l-bit strings can be represented negatively using only O(l·n) strings, through the use of an additional symbol. It is also shown that membership queries for the positive representation can be processed against the negative representation in time no worse than linear in its size, while reconstructing the original positive set from its negative representation is an $\mathcal{NP}$-hard problem. The paper introduces algorithms for constructing negative representations as well as operations for updating and maintaining them.
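For intuition, a prefix-style construction in this spirit can be sketched in a few lines (the function names are illustrative and this is a generic sketch, not necessarily the authors' exact algorithm): for every prefix of a positive string, emit the one-bit extensions that leave the positive set, padded with the extra wildcard symbol.

```python
def negative_db(positive, l):
    """Build a negative database over {0,1}^l with the wildcard '*'.

    For every prefix w of a positive string and every bit b such that
    w+b is NOT a prefix of any positive string, emit the entry
    w + b + '*'*(l-len(w)-1).  The entries jointly match exactly the
    strings outside `positive`, using O(l*n) entries.
    """
    prefixes = {s[:i] for s in positive for i in range(l + 1)}
    ndb = []
    for i in range(l):
        for w in {s[:i] for s in positive}:
            for b in "01":
                if w + b not in prefixes:
                    ndb.append(w + b + "*" * (l - i - 1))
    return ndb

def matches(pattern, x):
    # '*' matches either bit
    return all(p in ("*", c) for p, c in zip(pattern, x))

def in_positive(ndb, x):
    # membership in the positive set: x matches NO negative entry
    return not any(matches(p, x) for p in ndb)
```

Note that the membership test is linear in the size of the negative database, matching the query bound quoted in the abstract.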
By
Villatoro-Tello, Esaú; Villaseñor-Pineda, Luis; Montes-y-Gómez, Manuel
1 Citation
Recent evaluation results from Geographic Information Retrieval (GIR) indicate that current information retrieval methods are effective at retrieving relevant documents for geographic queries, but have severe difficulties generating a pertinent ranking of them. Motivated by these results, in this paper we present a novel re-ranking method which employs information obtained through a relevance feedback process to perform a ranking refinement. Experiments show that the proposed method improves the ranking generated by a traditional IR engine, as well as the results of traditional re-ranking strategies such as query expansion via relevance feedback.
By
Durillo, Juan J.; García-Nieto, José; Nebro, Antonio J.; Coello, Carlos A. Coello; Luna, Francisco; Alba, Enrique
57 Citations
Particle Swarm Optimization (PSO) has received increasing attention in the optimization research community since its first appearance in the mid-1990s. Regarding multi-objective optimization, a considerable number of algorithms based on Multi-Objective Particle Swarm Optimizers (MOPSOs) can be found in the specialized literature. Unfortunately, no experimental comparisons have been made in order to clarify which MOPSO version shows the best performance. In this paper, we use a benchmark composed of three well-known problem families (ZDT, DTLZ, and WFG) with the aim of analyzing the search capabilities of six representative state-of-the-art MOPSOs, namely NSPSO, SigmaMOPSO, OMOPSO, AMOPSO, MOPSOpd, and CLMOPSO. We additionally propose a new MOPSO algorithm, called SMPSO, characterized by the inclusion of a velocity constraint mechanism; it obtains promising results where the rest perform inadequately.
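A velocity constraint of the kind the abstract describes can be sketched with two standard ingredients, a constriction factor and per-variable clamping (the exact formulae below are a plausible reading of such mechanisms, not a verified transcription of the SMPSO paper):

```python
import math

def constriction(c1, c2):
    """Clerc-style constriction factor; only active when c1 + c2 > 4."""
    phi = c1 + c2
    if phi <= 4.0:
        return 1.0
    return 2.0 / abs(2.0 - phi - math.sqrt(phi * phi - 4.0 * phi))

def constrain_velocity(v, lower, upper):
    """Clamp each velocity component to [-delta_j, delta_j],
    where delta_j is half the range of decision variable j."""
    out = []
    for vj, lo, up in zip(v, lower, upper):
        d = (up - lo) / 2.0
        out.append(max(-d, min(d, vj)))
    return out
```

Clamping keeps particles from "exploding" out of the feasible region, which is one common failure mode of unconstrained MOPSOs.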
By
López Jaimes, Antonio; Coello, Carlos A. Coello; Urías Barrientos, Jesús E.
25 Citations
In this paper, we propose and analyze two schemes to integrate an objective reduction technique into a multi-objective evolutionary algorithm (MOEA) in order to cope with many-objective problems. One scheme periodically reduces the number of objectives during the search until the required objective subset size has been reached and, towards the end of the search, uses the original objective set again. The second approach is a more conservative scheme that alternately uses the reduced and the entire set of objectives to carry out the search. Besides improving computational efficiency by removing some objectives, the experimental results showed that both objective reduction schemes also considerably improve the convergence of an MOEA in many-objective problems.
By
Hutter, Dieter; Monroy, Raúl
1 Citation
Security protocols are crucial to achieve trusted computing. However, designing security protocols is not easy, and so security protocols are typically faulty and have to be repaired. Continuing previous work, we present first steps towards automating this repair process, especially for protocols that are susceptible to type-flaw attacks. To this end, we extend the notion of strand spaces by introducing an implementation layer for messages and extending the capabilities of a penetrator to swap messages that share the same implementation. Based on this framework, we are able to trace type-flaw attacks to incompatibilities between the way messages are implemented and the design of concrete security protocols. Heuristics are given to change either the implementation or the protocol to avoid these situations.
By
Cleofas, Laura; Valdovinos, Rosa Maria; García, Vicente; Alejo, Roberto
3 Citations
In real-world applications, it has been observed that class imbalance (significant differences in class prior probabilities) may produce an important deterioration of classifier performance, in particular for patterns belonging to the less represented classes. One method to tackle this problem consists of resampling the original training set, by over-sampling the minority class, under-sampling the majority class, or both. In this paper, we propose two ensemble models (using a modular neural network and the nearest neighbor rule) trained on datasets under-sampled with genetic algorithms. Experiments with real datasets demonstrate the effectiveness of the proposed methodology.
By
Dijkman, Remco; Dumas, Marlon; García-Bañuelos, Luciano
139 Citations
We investigate the problem of ranking all process models in a repository according to their similarity with respect to a given process model. We focus specifically on the application of graph matching algorithms to this similarity search problem. Since the corresponding graph matching problem is NP-complete, we seek a compromise between computational complexity and the quality of the computed ranking. Using a repository of 100 process models, we evaluate four graph matching algorithms, ranging from a greedy one to a relatively exhaustive one. The results show that the mean average precision obtained by a fast greedy algorithm is close to that obtained with the most exhaustive algorithm.
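The greedy end of that spectrum can be sketched as follows; the string similarity from `difflib` is an illustrative stand-in for the label similarity such matchers typically use, not the paper's exact metric:

```python
from difflib import SequenceMatcher

def greedy_similarity(nodes_a, nodes_b):
    """Greedy graph-matching sketch over node labels: repeatedly pair the
    two most similar unmatched labels, then return the total pair
    similarity normalized by the size of the larger model."""
    pairs = sorted(((SequenceMatcher(None, a, b).ratio(), a, b)
                    for a in nodes_a for b in nodes_b), reverse=True)
    used_a, used_b, score = set(), set(), 0.0
    for s, a, b in pairs:
        if a not in used_a and b not in used_b:
            used_a.add(a)
            used_b.add(b)
            score += s
    return score / max(len(nodes_a), len(nodes_b))
```

A full matcher would also score edge correspondences; greediness is what keeps the computation polynomial despite the NP-complete exact problem.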
By
Mejía, David A.; Favela, Jesús; Morán, Alberto L.
Hospital workers need information to decide on the appropriate course of action for patient care; this information can be obtained from artifacts such as medical records and lab results, or as a result of interactions with others. However, these exchanges can be a source of medical errors, since the information is not usually preserved and can be lost, totally or partially, due to the volatility of human memory. This happens because of the verbal nature of the interaction, or because of the lack of an infrastructure that facilitates the capture of information even when hospital workers are on the move. The capabilities increasingly found in smartphones, such as Wi-Fi, a touch screen or a D-pad (directional pad), a built-in camera, accelerometers, and contact management software, make it feasible to record significant information about the interactions that take place in the hospital and seamlessly retrieve it to support work activities. Thus, in this paper we propose a system to capture and manage collaboration outcomes in hospitals through the implementation of mobile collaboration spheres on smartphones.
By
Alba, Alfonso; Arce-Santana, Edgar
1 Citation
In this paper, we propose a new theoretical framework, based on phase correlation, for efficiently solving the correspondence problem. The proposed method allows area matching algorithms to perform at high frame rates, and can be applied to various problems in computer vision. In particular, we demonstrate the advantages of this method in the estimation of dense disparity maps in real time. A fairly optimized version of the proposed algorithm, implemented on a dual-core PC architecture, is capable of running at 100 frames per second with an image size of 256 × 256.
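The core of phase correlation is compact enough to sketch (this is the generic technique for estimating a circular shift, not the authors' optimized real-time variant):

```python
import numpy as np

def phase_correlation_shift(a, b):
    """Estimate the circular shift between two 2-D images via phase
    correlation: the peak of the inverse FFT of the normalized
    cross-power spectrum gives the displacement of `a` relative to `b`."""
    A, B = np.fft.fft2(a), np.fft.fft2(b)
    cross = A * np.conj(B)
    cross /= np.maximum(np.abs(cross), 1e-12)  # keep only the phase
    corr = np.real(np.fft.ifft2(cross))
    return np.unravel_index(np.argmax(corr), corr.shape)
```

Because the spectrum is whitened, the correlation surface collapses to a sharp peak, which is what makes the method fast and robust for dense matching.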
By
Czyzowicz, J.; Kranakis, E.; Krizanc, D.; Lambadaris, I.; Narayanan, L.; Opatrny, J.; Stacho, L.; Urrutia, J.; Yazdani, M.
28 Citations
We consider n mobile sensors located on a line containing a barrier represented by a finite line segment. The sensors form a wireless sensor network and are able to move within the line. An intruder traversing the barrier can be detected only when it is within the sensing range of at least one sensor. The sensor network establishes barrier coverage of the segment if no intruder can penetrate the barrier from any direction in the plane without being detected. Starting from arbitrary initial positions of the sensors on the line, we are interested in finding final positions that establish barrier coverage and minimize the maximum distance traversed by any sensor. We distinguish several variants of the problem, based on (a) whether or not the sensors have identical ranges, (b) whether or not complete coverage is possible, and (c) in the case when complete coverage is impossible, whether or not the maximal coverage is required to be contiguous. For the case of n sensors with identical range, when complete coverage is impossible, we give linear-time optimal algorithms that achieve maximal coverage, both for the contiguous and the non-contiguous case. When complete coverage is possible, we give an $O(n^2)$ algorithm for an optimal solution, a linear-time approximation scheme with approximation factor 2, and a (1 + ε) PTAS. When the sensors have unequal ranges, we show that a variation of the problem is NP-complete and identify some instances which can be solved with our algorithms for sensors with unequal ranges.
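For intuition about the identical-range case, the feasibility question "can every sensor move at most d and still achieve complete coverage?" admits a simple order-preserving greedy sweep, which can then be wrapped in a numeric binary search. This is a sketch of the idea only, not the paper's exact $O(n^2)$ algorithm or its approximation schemes:

```python
def feasible(positions, r, barrier, d):
    """Identical sensing range r; can sorted sensors each move at most d
    and fully cover [barrier[0], barrier[1]]?  Order-preserving greedy."""
    left, right = barrier
    x = left                          # leftmost point not yet covered
    for p in sorted(positions):
        if x >= right:
            break
        if p + d < x - r:             # too far left: cannot even reach x
            continue
        if p - d > x + r:             # too far right: the gap at x stays open
            return False
        c = min(x + r, p + d)         # rightmost reachable center covering x
        x = c + r
    return x >= right

def min_max_move(positions, r, barrier, eps=1e-6):
    """Binary-search the smallest max displacement that stays feasible."""
    lo = 0.0
    hi = max(abs(p - b) for p in positions for b in barrier) + r
    while hi - lo > eps:
        mid = (lo + hi) / 2.0
        if feasible(positions, r, barrier, mid):
            hi = mid
        else:
            lo = mid
    return hi
```

The binary search works because feasibility is monotone in d: allowing larger moves never hurts.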
By
Barrón-Cedeño, Alberto; Sierra, Gerardo; Drouin, Patrick; Ananiadou, Sophia
13 Citations
The C-value/NC-value algorithm, a hybrid approach to automatic term recognition, was originally developed to extract multi-word term candidates from specialised documents written in English. Here, we present three main modifications to this algorithm that affect how the obtained output is refined. The first modification aims to maximise the number of real terms in the list of candidates with a new approach to the stoplist application process. The second modification adapts the C-value calculation formula in order to consider single-word terms. The third modification changes how the term candidates are grouped, exploiting a lemmatised version of the input corpus. Additionally, the size of the candidate's context window is variable. We also show the necessary linguistic modifications to apply this algorithm to the recognition of term candidates in Spanish.
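For reference, the base formula being adapted can be sketched as follows (a standard reading of the C-value of Frantzi et al.; the paper's single-word adaptation mentioned above is not reproduced here):

```python
import math

def c_value(term_len, freq, containing_freqs=()):
    """Classic C-value for a multi-word candidate: log2(length in words)
    times its frequency, discounted by the mean frequency of the longer
    candidates that contain it (pass () when the candidate is not nested)."""
    if containing_freqs:
        freq = freq - sum(containing_freqs) / len(containing_freqs)
    return math.log2(term_len) * freq
```

Note that the log2 factor is zero for term_len == 1, which is precisely why a modification is needed before single-word terms can be scored.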
By
Garcia, I.; Pacheco, C.; Garcia, W.
In the last few years, Educational Software has developed enormously, but a large part of it has been badly organized and poorly documented. Recent advances in software technology can promote cooperative learning, a teaching strategy in which small teams, each composed of students of different levels of ability, use different learning activities to improve their understanding of a subject. How can we design Educational Software if we never learnt how to do it? This paper describes how the Technological University of the Mixtec Region is using a cooperative application to improve the quality of the education offered to its students in Educational Software design.
By
Escobar-Acevedo, Adelina; Montes-y-Gómez, Manuel; Villaseñor-Pineda, Luis
Cross-language text classification (CLTC) aims to take advantage of existing training data in one language to construct a classifier for another language. In addition to the expected translation issues, CLTC is also complicated by the cultural distance between the two languages, which causes documents belonging to the same category to concern very different topics. This paper proposes a re-classification method whose purpose is to reduce the errors caused by this phenomenon by considering information from the target-language documents themselves. Experimental results on a news corpus considering three pairs of languages and four categories demonstrate the appropriateness of the proposed method, which improved the initial classification accuracy by up to 11%.
By
Herskovic, Valeria; Mejía, David A.; Favela, Jesús; Morán, Alberto L.; Ochoa, Sergio F.; Pino, José A.
1 Citation
The critical nature of some working environments, such as hospitals or search and rescue operations, gives rise to the need for timely collaboration. However, interactions are not always possible, since potential collaborators may be unreachable because of the lack of a communication channel to carry out the interaction or due to their involvement in other activities. The use of adequate interaction facilitators may allow users to collaborate even in these circumstances. This paper presents a characterization of this type of situation and then introduces a set of design suggestions that may help improve opportunities for user interaction in time-critical mobile collaborative settings.
By
Chavoya, Arturo
3 Citations
Artificial Development is a field of Evolutionary Computation inspired by the developmental processes and cellular growth seen in nature. Multiple models of artificial development have been proposed in the past; they can be broadly divided into those based on biochemical processes and those based on a high-level grammar. Two of the most important aspects to consider when designing a cellular growth model are the type of representation used to specify the final features of the system, and the abstraction level necessary to capture the properties to be modeled. Although advances in this field have been significant, there is much knowledge to be gained before a model that approaches the level of complexity found in living organisms can be built.
By
Monroy, Alfredo; Calvo, Hiram; Gelbukh, Alexander
6 Citations
Previous work has shown that modeling the relationships between the articles of a regulation as vertices of a graph network performs twice as well as traditional information retrieval systems in returning articles relevant to a question. In this work, we experiment with natural language techniques such as lemmatization and the use of manual and automatic thesauri to improve question-based document retrieval. For the construction of the graph, we follow the approach of representing the set of all articles as a graph; the question is split into two parts, and each of them is added to the graph. Then several paths are constructed from part A of the question to part B, so that the shortest path contains the articles relevant to the question. We evaluate our method by comparing the answers given by a traditional information retrieval system (a vector space model adjusted for article retrieval instead of document retrieval) and the answers to 21 questions given manually by the general lawyer of the National Polytechnic Institute, based on 25 different regulations (academy regulation, scholarships regulation, postgraduate studies regulation, etc.), with the answers of our system based on the same set of regulations. We found that lemmatization increases performance by around 10%, while the use of a thesaurus has a low impact.
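The path construction at the heart of this approach can be sketched with a plain breadth-first search over the article graph (the graph shape and node names below are illustrative, not the paper's data):

```python
from collections import deque

def shortest_article_path(adj, part_a, part_b):
    """BFS from question part A to question part B through the article
    graph; the returned path passes through the articles that connect
    the two parts of the question.  Returns None if no path exists."""
    prev, queue = {part_a: None}, deque([part_a])
    while queue:
        u = queue.popleft()
        if u == part_b:
            path = []
            while u is not None:
                path.append(u)
                u = prev[u]
            return path[::-1]
        for v in adj.get(u, ()):
            if v not in prev:
                prev[v] = u
                queue.append(v)
    return None
```

With weighted edges (e.g. similarity scores between articles), BFS would be replaced by Dijkstra's algorithm, but the structure of the idea is the same.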
By
Garza-Fabre, Mario; Pulido, Gregorio Toscano; Coello, Carlos A. Coello
30 Citations
An important issue with Evolutionary Algorithms (EAs) is the way the best solutions are identified in order to guide the search process. Fitness comparison among solutions in single-objective optimization is straightforward, but when dealing with multiple objectives, it becomes a non-trivial task. Pareto dominance has been the most commonly adopted relation for comparing solutions in a multi-objective optimization context. However, it has been shown that as the number of objectives increases, the convergence ability of approaches based on Pareto dominance decreases. In this paper, we propose three novel fitness assignment methods for many-objective optimization. We also perform a comparative study in order to investigate how effective the proposed approaches are at guiding the search in high-dimensional objective spaces. Results indicate that our approaches behave better than six state-of-the-art fitness assignment methods.
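The Pareto-dominance relation discussed above is, for minimization:

```python
def dominates(a, b):
    """a dominates b (minimization): a is no worse in every objective
    and strictly better in at least one."""
    return (all(x <= y for x, y in zip(a, b)) and
            any(x < y for x, y in zip(a, b)))

def pareto_front(points):
    """Non-dominated subset of a list of objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]
```

As the number of objectives grows, randomly drawn vectors become mutually non-dominated with increasing probability, so this relation alone provides less and less selection pressure; that is the degradation the abstract refers to.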
By
Castelán, Mario; Puerto-Souza, Gustavo A.; Horebeek, Johan
3 Citations
In this paper, we compare four different Subspace Multiple Linear Regression methods for 3D face shape prediction from a single 2D intensity image. This problem is situated in the low observation-to-variable ratio context, where the sample covariance matrix is likely to be singular. Lately, efforts have been directed towards latent-variable based methods that estimate a regression operator while maximizing specific criteria between the 2D and 3D face subspaces. Regularization methods, on the other hand, impose a regularizing term on the covariance matrix in order to ensure numerical stability and to improve the out-of-training error. We compare the performance of three latent-variable based approaches and one regularization approach, namely Principal Component Regression, Partial Least Squares, Canonical Correlation Analysis and Ridge Regression. We analyze the influence of the different latent variables as well as of the regularizing parameters in the regression process. Similarly, we identify the strengths and weaknesses of both regularization and latent-variable approaches for the task of 3D face prediction.
By
Frausto-Solis, Juan; Soberon-Mainero, Xavier; Liñán-García, Ernesto
3 Citations
This paper presents a new approach named Multi-Quenching Annealing (MQA) for the Protein Folding Problem (PFP). MQA has two phases: a Quenching Phase (QP) and an Annealing Phase (AP). QP is applied at extremely high temperatures, when the highest energy variations can occur. AP searches for the optimal solution at high and low temperatures, when the energy variations are not very high. The temperature during the QP is decreased by an exponential function. Both QP and AP are divided into several sub-phases that decrease the temperature parameter until a dynamic equilibrium is detected by measuring the solution quality. In addition, an efficient analytical method to tune the algorithm parameters is used. Experimentation presented in the paper shows that MQA can obtain high-quality solutions for the PFP.
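The two-phase idea can be sketched as a standard Metropolis loop whose cooling rate switches from a fast (quenching) factor to a slow (annealing) one. All schedules, thresholds and parameter names here are illustrative choices for a toy problem, not the paper's tuned values:

```python
import math, random

def multi_quench_anneal(energy, neighbor, x0, t_quench=1e3, t_low=1e-3,
                        alpha_q=0.5, alpha_a=0.95, steps=50, seed=1):
    """Two-phase annealing sketch: cool exponentially fast while the
    temperature is high (quenching), then slowly below a threshold
    (annealing).  Returns the final solution and its energy."""
    rng = random.Random(seed)
    x, e, t = x0, energy(x0), t_quench
    while t > t_low:
        for _ in range(steps):
            y = neighbor(x, rng)
            de = energy(y) - e
            # Metropolis acceptance: always downhill, uphill with prob e^(-de/t)
            if de <= 0 or rng.random() < math.exp(-de / t):
                x, e = y, e + de
        t *= alpha_q if t > 1.0 else alpha_a
    return x, e
```

The quenching factor spends little time at temperatures where almost every move is accepted anyway, reserving the computational budget for the temperatures where acceptance decisions actually shape the search.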
By
González-Castro, Victor; MacKinnon, Lachlan M.; Pilar Angeles, María
In recent years there has been growing interest in the research community in the utilisation of alternative data models that abandon the relational record storage and manipulation structure. The authors have already reported experimental considerations of the behaviour of the n-ary Relational, Binary-Relational, Associative and Transrelational models within the context of Data Warehousing [1], [2], [3], addressing issues of storage efficiency and combinatorial explosion through data repetition. In this paper we present the results obtained during the industrial usage of a Binary-Relational model based DBMS within a reference architectural configuration. These industrial results are similar to the ones obtained during the experimental stage of this research in the University laboratory [4], where improvements in query speed and data load, and considerable reductions in disk space, are achieved. The industrial tests considered a wide set of industries: Manufacturing, Government, Retail, Telecommunications and Finance.
By
Zatarain-Cabada, Ramón; Barrón-Estrada, M. Lucia; Zepeda-Sánchez, Leopoldo; Sandoval, Guillermo; Osorio-Velazquez, J. Moises; Urias-Barrientos, J. E.
1 Citation
The identification of the best learning style in an Intelligent Tutoring System must be considered essential to the success of the teaching process. In many implementations of automatic classifiers, finding the right student learning style represents the hardest assignment. The reason is that most of the techniques work using expert groups or a set of questionnaires which define how the learning styles are assigned to students. This paper presents a novel approach for automatic learning style classification using a Kohonen network. The approach is used by an authoring tool for building Intelligent Tutoring Systems running under a Web 2.0 collaborative learning platform. The tutoring systems, together with the neural network, can also be exported to mobile devices. We present different results of the approach working under the authoring tool.
By
Osorio Galindo, Mauricio; Pascucci, Simone
We present results about the logical consequence test under classical logic with respect to the Theory of Parameterized Complexity and Computation [1]. We show how a normal logic program P can be partitioned into subsets of clauses such that we can define an algorithm for proving sets of atoms whose complexity is bounded by a relation exponential in a fixed parameter k and polynomial in the original size of the problem, namely the size of P. As an example of application, we study the model checking problem with respect to the Pstable semantics.
By
Gómez, Victor; Mendoza, Sonia; Decouchant, Dominique; Rodríguez, José
Ubiquitous computing integrates small Internet/Intranet-connected sensors as well as powerful and dynamic devices into people's working and domestic areas. An intelligent area contains many devices that provide information about the state of each artifact (e.g., a power failure of the refrigerator) without user intervention. Service discovery systems are essential to achieve this sophistication, as they allow services and users to discover, configure and communicate with other services and users. However, most of these systems only provide support for interaction between services and software clients. In order to cope with this limitation, the SEDINU system aims at supporting interactions between nomadic users and the services provided by areas. As users may move within the organization from one area to another in order to accomplish their tasks, the system also provides support for user-user interaction and collaboration under specific contexts (role, location and goals).
By
Nieves, Juan Carlos; Osorio, Mauricio; Zepeda, Claudia
3 Citations
Extension-based argumentation semantics is a successful approach for performing non-monotonic reasoning based on argumentation theory. An interesting property of some extension-based argumentation semantics is that they can be characterized in terms of logic programming semantics. In this paper, we present novel results on this topic. In particular, we show that one can induce an argumentation semantics (which we call Stratified Argumentation Semantics) from a logic programming semantics that is based on stratified minimal models. We show that the stratified argumentation semantics overcomes some problems of extension-based argumentation semantics based on admissible sets, and that it coincides with the argumentation semantics CF2.
By
Muhammad, Aslam; Martinez Enriquez, Ana Maria; Escalada-Imaz, Gonzalo
The lack of assistance support may result in the disturbance of co-authors by beginners who ask for help when they have trouble producing or reusing shared resources. Additionally, collaborators may not be sure whether their respective production is consistent with the common collaborative contribution. We tackle this issue by developing a group-awareness knowledge-based system that takes responsibility for automatically evaluating and reusing mathematical formulae, and deduces which participant is a possible expert to help others.
By
Juárez-González, Antonio; Montes-y-Gómez, Manuel; Villaseñor-Pineda, Luis; Ortíz-Arroyo, Daniel
Some recent works have shown that the “perfect” selection of the best IR system per query could lead to a significant improvement in retrieval performance. Motivated by this fact, in this paper we focus on the automatic selection of the best retrieval result from a given set of result lists generated by different IR systems. In particular, we propose five heuristic measures for evaluating the relative relevance of each result list, which take into account the redundancy and ranking of documents across the lists. Preliminary results on three different data sets, considering 216 queries, are encouraging. They show that the proposed approach could slightly outperform the results of the best individual IR system in two out of three collections, and that it could significantly improve on the average results of the individual systems across all data sets. In addition, the achieved results indicate that our approach is a competitive alternative to traditional data fusion methods.
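One redundancy-style heuristic in this spirit can be sketched as follows. The exact five measures are not reproduced; this only shows the flavor: reward a list whose top-ranked documents recur in the other systems' lists, weighting higher ranks more.

```python
def redundancy_score(lists, i, k=10):
    """Score result list i by how often its top-k documents reappear in
    the other lists' top-k, weighting rank positions by 1/(rank+1)."""
    others = [set(l[:k]) for j, l in enumerate(lists) if j != i]
    return sum(1.0 / (rank + 1)
               for rank, doc in enumerate(lists[i][:k])
               for o in others if doc in o)

def best_list(lists, k=10):
    """Pick the list whose top documents are most corroborated by the rest."""
    return max(range(len(lists)), key=lambda i: redundancy_score(lists, i, k))
```

Unlike data fusion, which merges lists into a new ranking, this approach keeps one system's list intact and only chooses which one to trust.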
By
Vázquez, Roberto A.; Sossa, Humberto
2 Citations
Morphological associative memories (MAMs) are a special type of associative memory which exhibit optimal absolute storage capacity and one-step convergence. This associative model substitutes additions/subtractions and maximums/minimums for the usual additions and multiplications. It has been applied to different pattern recognition problems, including face localization and the reconstruction of gray-scale images. Despite its power, it has not been applied to problems involving true-color patterns. In this paper we describe how a Morphological Heteroassociative Memory (MHAM) can be applied to problems that involve true-color patterns. In addition, a study of the behavior of this associative model in the reconstruction of true-color images is performed using a benchmark of 14,400 images altered by different types of noise.
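The max-type memory construction and its min-plus recall are short enough to sketch with NumPy (a generic textbook formulation; the paper's handling of true-color patterns and noise is not reproduced):

```python
import numpy as np

def mham_train(X, Y):
    """Max memory for hetero-association: M[i, j] = max_k (Y[i,k] - X[j,k]).
    X (n x K) and Y (m x K) hold the K input/output patterns as columns."""
    return np.max(Y[:, None, :] - X[None, :, :], axis=2)

def mham_recall(M, x):
    """Min-plus product: y[i] = min_j (M[i, j] + x[j])."""
    return np.min(M + x[None, :], axis=1)
```

Note how max/min and addition/subtraction replace the sums and products of a classical correlation memory, which is the substitution the abstract describes.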
By
Escalante, Hugo Jair; Gonzalez, Jesús A.; Hernández, Carlos A.; López, Aurelio; Montes, Manuel; Morales, Eduardo; Sucar, Luis E.; VillaseñorPineda, Luis
2 Citations
This paper describes experimental results of two approaches to multimedia image retrieval: annotation-based expansion and late fusion of mixed methods. The former consists of expanding manual annotations with labels generated by automatic annotation methods. Experimental results show that the performance of text-based methods can be improved with this strategy, especially for visual topics, motivating further research in several directions. The second approach consists of combining the outputs of diverse image retrieval models based on different information. Experimental results show that competitive performance, in both retrieval and result diversification, can be obtained with this simple strategy. It is interesting that, contrary to previous work, the best results of the fusion were obtained by assigning a high weight to visual methods. Furthermore, a probabilistic modeling approach to result diversification is proposed; experimental results reveal that some modifications are needed to achieve satisfactory results with this method.
more …
By
Hervás, Ramón; Nava, Salvador W.; Chavira, Gabriel; Sánchez, Carlos; Bravo, José
Show all (5)
Intelligent environments are responsive and sensitive to the presence of people. Users are integrated into a digital atmosphere which is adaptive to their needs, habits and emotions. Under the Ambient Intelligence vision, our main objective is to offer visualization of information services. They are integrated into a digital environment with contents adapted to the situation of the context at all times. These services are offered through a process of user identification by modelling the context around the user (combining Near Field Communication (NFC) and Radio Frequency Identification (RFID) technologies). Moreover, by analyzing typical situations, we can define the “mosaic roles” that guide the generation process of Pervasive Display Systems.
more …
By
Carrillo, Maya; Villatoro-Tello, Esaú; López-López, A.; Eliasmith, Chris; Montes-y-Gómez, Manuel; Villaseñor-Pineda, Luis
Show all (6)
7 Citations
The bag-of-words representation (BoW), which is widely used in information retrieval (IR), represents documents and queries as word lists that express nothing about context information. When we look for information, not everything is stated explicitly in a document, so context information is needed to understand its content. This paper proposes the use of bags of concepts (BoC) and Holographic Reduced Representations (HRR) in IR. These representations go beyond BoW by incorporating context information into document representations. Both HRR and BoC are produced using a vector space methodology known as Random Indexing, and allow expressing additional knowledge from different sources. Our experiments have shown the feasibility of the representations and improved the mean average precision by up to 7% compared with the traditional vector space model.
more …
By
Chávez, Edgar; Fraser, Maia; Tejeda, Héctor
1 Citations
For an ad-hoc network with n nodes, we propose a proactive routing protocol without routing tables which uses O(log n) bits per node for the location service tables. The algorithm is based on 1-dimensional virtual coordinates, which we call labels. The decision of where to forward a packet is oblivious and purely local, depending only on the labels of the immediate neighbours and the label of the destination: the packet is forwarded to the neighbour whose label is closest to that of the destination. The algorithm is based on mapping the network to an ordered list where each node has one or more integer labels. This labeling can be produced by any arbitrary traversal of the network visiting all the nodes, in particular by a depth-first search of a flood tree, which gives a traversal of length 2n. We show experimentally that, in terms of hop count, our routing algorithm is far superior to geographic protocols in randomly generated networks, and for sparse networks it produces routes of length very close to those of the shortest path.
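The greedy forwarding rule described in this abstract is simple enough to sketch directly. The topology, the labels, and the single-label-per-node simplification below are illustrative assumptions, not the paper's experimental setup:

```python
# Greedy label-based forwarding: each node sends the packet to the
# neighbour whose label is closest to the destination's label.
# Node names, labels and the topology below are illustrative only.

def forward(node, dest_label, neighbours, labels):
    """Pick the neighbour whose label is nearest to dest_label."""
    return min(neighbours[node], key=lambda n: abs(labels[n] - dest_label))

def route(src, dst, neighbours, labels, max_hops=100):
    """Follow the greedy rule hop by hop until the destination is reached."""
    path, node = [src], src
    for _ in range(max_hops):
        if node == dst:
            return path
        node = forward(node, labels[dst], neighbours, labels)
        path.append(node)
    return path  # gave up (loop or unreachable within max_hops)

# Labels as produced by a depth-first traversal of a small example network.
labels = {"a": 0, "b": 1, "c": 2, "d": 3}
neighbours = {"a": ["b"], "b": ["a", "c"], "c": ["b", "d"], "d": ["c"]}
print(route("a", "d", neighbours, labels))  # ['a', 'b', 'c', 'd']
```

The decision at every hop uses only the neighbours' labels and the destination's label, which is what makes the protocol oblivious and table-free; the paper's experiments measure how well this heuristic performs on general random graphs.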
more …
By
Garcia-Baleon, H. A.; Alarcon-Aquino, V.; Starostenko, O.
2 Citations
In this paper we report an approach for cryptographic key generation based on keystroke dynamics and the k-medoids algorithm. The stages that comprise the approach are training-enrollment and user verification. The proposed approach is able to verify the identity of individuals offline, avoiding the use of a centralized database. The performance of the proposed approach is assessed using 20 samples of keystroke dynamics from 20 different users. Simulation results show a false acceptance rate (FAR) of 5.26% and a false rejection rate (FRR) of 10%. The cryptographic key released by the proposed approach may be used in several encryption algorithms.
more …
By
Dalmau, Oscar; Rivera, Mariano
3 Citations
We propose a general Bayesian model for image segmentation with spatial coherence through a Markov Random Field prior. We also study variants of the model and their relationship. In this work we use the Matusita Distance, although our formulation admits other metric divergences. Our main contributions are the following. We propose a general MRF-based model for image segmentation. We study a model based on the Matusita Distance, whose solution is found directly in the discrete space while keeping the advantage of working in a continuous space. We show experimentally that this model is competitive with other state-of-the-art models. We propose a novel way to deal with the non-linearities (irrational terms) related to the Matusita Distance. Finally, we propose an optimization method that allows us to obtain a hard image segmentation almost in real time, and we also prove its convergence.
more …
By
Pineda-Bautista, Bárbara B.; Carrasco-Ochoa, Jesús Ariel; Martínez-Trinidad, José Fco.
1 Citations
In this work, a new method for class-specific feature selection, which selects a possibly different feature subset for each class of a supervised classification problem, is proposed. Since conventional classifiers do not allow using a different feature subset for each class, the use of a classifier ensemble and a new decision rule for classifying new instances are also proposed. Experimental results over different databases show that better accuracies are achieved using the proposed method than with traditional feature selection methods.
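As a rough illustration of the class-specific idea (not the authors' actual selector, classifiers, or decision rule, which the abstract does not detail), the sketch below scores features per class in one-vs-rest fashion, gives each class its own member trained on its own subset, and lets the member with the closest fit decide:

```python
# Class-specific feature selection sketch: every class gets its own
# feature subset and its own ensemble member. The scoring criterion
# (normalized mean separation) and the nearest-class-mean decision rule
# are illustrative stand-ins, not the method proposed in the paper.
import numpy as np

def select_features(X, y, c, k):
    """Rank features by how far class c's mean lies from the rest."""
    in_c, rest = X[y == c], X[y != c]
    score = np.abs(in_c.mean(0) - rest.mean(0)) / (X.std(0) + 1e-9)
    return np.argsort(score)[-k:]          # indices of the k best features

def train(X, y, k=1):
    """One member per class: its feature subset and its class mean."""
    return {c: (select_features(X, y, c, k), X[y == c].mean(0))
            for c in np.unique(y)}

def predict(members, X):
    """Assign each row to the class whose member, restricted to its own
    feature subset, puts the row closest to the class mean."""
    preds = []
    for x in X:
        d = {c: np.linalg.norm(x[f] - mu[f]) for c, (f, mu) in members.items()}
        preds.append(min(d, key=d.get))
    return np.array(preds)
```

Any scoring criterion and member classifier could be substituted; the point the abstract makes is structural, namely that an ensemble with a per-class decision rule lets each class use a different feature subset.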
more …
By
R., Laura Cruz; Gonzalez B., Juan J.; Orta, José F. Delgado; Arrañaga C., Barbara A.; Fraire H., Hector J.
Show all (5)
1 Citations
In this paper a hybrid ant colony system algorithm is presented. A new approach to updating the pheromone trails, called learning levels, is incorporated. Learning levels is based on the distributed Q-learning algorithm, a variant of reinforcement learning, which is incorporated into the basic ant colony algorithm. The hybrid algorithm is used to solve the Vehicle Routing Problem with Time Windows. Experimental results on Solomon’s dataset of instances reveal that learning levels improve execution time and solution quality with respect to the basic ant colony system algorithm: 0.15% for traveled distance and 0.6% for vehicles used. We are now applying the hybrid ant colony system in other domains.
more …
By
Galicia-Haro, Sofia N.; Gelbukh, Alexander F.
This paper reports research on temporal expressions formed by a common temporal expression for a period of years modified by an adverb of time. From a Spanish corpus we found that some of those phrases are age-related expressions. To automatically determine the temporal phrases with such a meaning, we analyzed a larger sample obtained from the Internet. We analyzed these examples to define the relevant features to support a learning method. We present some preliminary results of applying a decision tree.
more …
By
Perez, Cynthia B.; Olague, Gustavo
This paper presents a new methodology based on Genetic Programming that aims to create novel mathematical expressions that could improve local descriptor algorithms. We introduce the RDGP-ILLUM descriptor operator, which was learned with two image pairs considering rotation, scale and illumination changes during the training stage. This descriptor operator has a performance similar to our previous RDGP descriptor proposed in [1], while outperforming the RDGP descriptor in an object recognition application. A set of experimental results has been used to test our evolved descriptor against three state-of-the-art local descriptors. We conclude that genetic programming is able to synthesize image operators that significantly outperform previous human-made designs.
more …
By
Mejia, David A.; Morán, Alberto L.; Favela, Jesus; Ochoa, Sergio F.; Pino, José
Show all (5)
The evaluation of groupware systems is considered a complex activity, mainly due to the impact this kind of tool can have on work practices, the multiple variables that influence its use and evaluation, and the high cost in time and resources required for an in situ evaluation. These reasons have complicated the creation of a generic guide for evaluating this type of tool. Some researchers in groupware evaluation have highlighted the need to evaluate groupware tools according to the context and characteristics of the organizations in which they will be deployed. Thus, in this paper we present a process to evaluate a tool that supports informal collaboration in hospitals. Due to the nature of hospital work and the difficulty of performing an in situ evaluation, our proposal involves a multi-phase evaluation process throughout the development lifecycle of the tool.
more …
By
Barceló, Grettel; Cendejas, Eduardo; Sidorov, Grigori; Bolshakov, Igor A.
Show all (4)
3 Citations
A task that has been widely studied in the field of natural language processing is Named Entity Recognition (NER). A great number of approaches have been developed to deal with the identification and classification of named entity strings in specific and open domains. Nevertheless, external modules have to be incorporated into many NER systems in order to solve the interpretation problems derived from proper nouns. In this article our focus is on the study of ambiguity in Hispanic nominal sequences, whose constitution poses three main problems: (1) the association of given names and/or surnames; (2) the composition of such elements by means of a connector; and (3) the duality of given name/surname. In order to analyze the magnitude of the problem, two gazetteers were built, one with 93998 given names and the other with 13779 surnames. The gazetteer entries were used as terminal symbols of the proposed grammar to determine the valid interpretations of the nominal sequences; this is done by means of an automatic labeling of all the elements the nominal sequences are made of.
more …
By
Torres, Jorge; Dodero, Juan Manuel
3 Citations
Educational metadata provide learning objects and designs with required information that is relevant to a learning situation. A learning design specifies how a learning process involves a set of people, in specific groups and roles, engaging in learning activities with appropriate resources and services. These elements are usually described using structured primitives of an Educational Modeling Language. Metadata records must explicitly provide a representation of the flow of learning activities and of how learning resources and services are utilized. We have analyzed a number of common workflow patterns in order to extend current Educational Modeling Languages’ primitives used in complex learning flows. The information model of the Learning Process Execution and Composition Language is used as the basis to extend the structured metadata required by such learning process descriptions.
more …
By
Arroyo, Gustavo; Ramos, J. Guadalupe; Tamarit, Salvador; Vidal, Germán
Show all (4)
We introduce a transformational approach to improve the first stage of offline partial evaluation of functional programs, the so-called binding-time analysis (BTA). For this purpose, we first introduce an improved defunctionalization algorithm that transforms higher-order functions into first-order ones, so that existing techniques for termination analysis and propagation of binding-times of first-order programs can be applied. Then, we define another transformation (tailored to defunctionalized programs) that allows us to get the accuracy of a polyvariant BTA from a monovariant BTA over the transformed program. Finally, we show a summary of experimental results that demonstrate the usefulness of our approach.
more …
By
Pulido, J. R. G.; Aréchiga, M. A.; Michel, E. M. R.; Reyes, G.; Zobin, V.
Show all (5)
Monitoring volcanic activity is a task that requires people from a number of disciplines. Infrastructure, on the other hand, has been built all over the world to keep track of these living earth entities, i.e., volcanoes. In this paper we present an approach that merges a number of computational tools and that may be incorporated into existing ones to predict important volcanic events. It mainly consists of applying artificial learning, ontologies, and software agents to the analysis, organization, and use of volcanic-domain data for the benefit of the communities of people living near volcanoes. This proposal allows domain experts to have a view of the knowledge contained in, and that can be extracted from, the Volcanic-Domain Digital Archives (VDDA). With further processing, specific-domain knowledge components can be embedded into the digital archive itself and then shared with and manipulated by software agents. In this first study, we deal with the issue of applying Self-Organizing Maps (SOM) to volcano-domain signals originated by the activity of the Volcano of Colima, Mexico. By applying this algorithm we have generated clusters of volcanic activity and can readily identify families of important events.
more …
By
Río, Manuel Beltrán; Cocho, Germinal
We trace the rank-size distribution of notes in harmonic music, which in previous works we suggested was much better represented by the two-parameter Beta distribution of the first kind than by the customary power law, to the ranked mixing of distributions dictated by the harmonic and instrumental nature of the piece. The same representation is shown to arise in other fields through the same type of ranked shuffling of distributions. We include the codon content of intergenic DNA sequences and the ranked distribution of sizes of trees in a given area as examples. We show that the fittings proposed increase in accuracy with the number of distributions that are mixed and ranked.
more …
By
González-Castro, Victor; MacKinnon, Lachlan M.; Pilar Angeles, María
1 Citations
In the last few years the amount of data stored on computer systems has been growing at an accelerated rate. These data are frequently managed within data warehouses. However, current data warehouse architectures based on n-ary Relational DBMSs are reaching their limits in efficiently managing such large amounts of data. Some DBMSs are able to load huge amounts of data; nevertheless, the response times become unacceptable for business users during information retrieval. In this paper we describe an alternative data warehouse reference architectural configuration (ADW) which addresses many issues that organisations are facing. The ADW approach uses a Binary-Relational DBMS as the underlying data repository. As a result, a number of improvements have been achieved, such as increased data density, reduced data sparsity, dramatically decreased query response times, and significantly reduced workload for data loading, backup and restore tasks.
more …
By
Ayanegui-Santiago, Huberto; Reyes-Galaviz, Orion F.; Chávez-Aragón, Alberto; Ramírez-Cruz, Federico; Portilla, Alberto; García-Bañuelos, Luciano
Show all (6)
1 Citations
Scientific communities around the world are paying increasing attention to collaborative networks to ensure they remain competitive; the Computer Science (CS) community is no exception. Discovering collaboration opportunities is a challenging problem in social networks. Traditional social network analysis allows us to observe which authors are already collaborating, how often they are related to each other, and how many intermediaries exist between two authors. In order to discover potential collaboration among Mexican CS scholars, we built a social network containing data from 1960 to 2008. We propose using a clustering algorithm and social network analysis to identify scholars who would be well advised to collaborate. The idea is to identify clusters consisting of authors who are completely disconnected but have opportunities to collaborate given their common research areas. After clustering the initial social network, we analyze the collaboration networks of each cluster to discover new collaboration opportunities based on the conferences where the authors have published. Our analysis was based on the large-scale DBLP bibliography and the census of Mexican scholars made by REMIDEC.
more …
By
Zanella, Vittorio; Ramirez, Geovany; Vargas, Héctor; Rosas, Lorna V.
Show all (4)
3 Citations
Image metamorphosis, commonly known as morphing, is a powerful tool for visual effects that consists of the fluid transformation of one digital image into another. There are many techniques for image metamorphosis, but in all of them a person supplies the correspondence between the features in the source image and the target image. In this paper we use a method to find the faces in the image, and Active Shape Models to find the features in the face images, in order to perform the metamorphosis of face images in frontal view automatically.
more …
By
Ayala-Raggi, Salvador E.; Altamirano-Robles, Leopoldo; Cruz-Enriquez, Janeth
We present a fast and robust iterative method for interpreting face images under non-uniform lighting conditions, using a fitting algorithm based on an illumination-based 3D active appearance model to fit a face model to an input face image. Our method improves the Jacobian at each iteration using the lighting parameters estimated in preceding iterations. In the training stage, we precalculate a set of synthetic face images of basis reflectances and albedo, generated by displacing, one at a time, each of the model parameters; subsequently, in the fitting stage, we use all these images in combination with lighting parameters to assemble a Jacobian matrix adapted to the illumination estimated in the last iteration. In contrast to other works, where an initial pose is required to begin the fit, our approach only uses a simple initialization in translation and scale. At the end of the fitting process, our algorithm obtains a compact set of parameters of albedo, 3D shape, 3D pose and illumination which describe the appearance of the input face image.
more …
By
Aguilar-González, Pablo M.; Kober, Vitaly
1 Citations
Correlation filters for recognition of a target in non-overlapping background noise are proposed. The object to be recognized is given implicitly; that is, it is placed in a noisy reference image at unknown coordinates. For the filter design, two performance criteria are used: signal-to-noise ratio and peak-to-output energy. Computer simulation results obtained with the proposed filters are discussed and compared with those of classical correlation filters in terms of discrimination capability.
more …
By
Vázquez, Roberto A.; Sossa, Humberto; Garro, Beatriz A.
2 Citations
Recently, it was shown how some metaphors adopted from the infant vision system are useful for face recognition. In this paper we adopt those biological hypotheses and apply them to the 3D object recognition problem. As infant vision responds to the low frequencies of the signal, a low-pass filter is used to remove high-frequency components from the image. Then we detect subtle features in the image by means of a random feature selection detector. Finally, a dynamic associative memory (DAM) is fed with this information for training and recognition. To test the accuracy of the proposal we use the Columbia Object Image Library (COIL-100).
more …
By
Viveros Jiménez, Francisco; Mezura-Montes, Efrén; Gelbukh, Alexander
1 Citations
A new evolutionary algorithm, Elitistic Evolution (termed EEv), is proposed in this paper. EEv is an evolutionary method for numerical optimization with adaptive behavior. EEv uses small populations (fewer than 10 individuals). It has an adaptive parameter to adjust the balance between global exploration and local exploitation. Elitism has a great influence on EEv’s process, and that influence is also controlled by the adaptive parameter. EEv’s crossover operator allows a recently generated offspring individual to be a parent of other offspring individuals of its generation. It requires the configuration of two user parameters (many state-of-the-art approaches use at least three). EEv is tested on a set of 16 benchmark functions and then compared with Differential Evolution and with some well-known Memetic Algorithms to show its efficiency. Finally, EEv is tested on a set of 10 benchmark functions of very high dimensionality (50, 100 and 200 dimensions) to show its robustness.
more …
By
Venegas, Héctor A. Montes; Marcial-Romero, J. Raymundo
We present preliminary results of a path planner for two robotic arms sharing the same workspace. Unlike many other approaches, our planner finds collision-free paths using the robots’ Cartesian space as a trade-off between completeness and no workspace preprocessing. Given the high dimensionality of the search space, we use a two-phase Genetic Algorithm to find a suitable path in workspaces cluttered with obstacles. Because the length of the path is unknown in advance, the planner manipulates a flexible and well-crafted representation which allows the path to grow or shrink during the search process. The performance of our planner was tested on several scenarios where the only moving objects were the two robotic arms. The test scenarios force the manipulators to move through narrow spaces, for which suitable and safe paths were found by the planner.
more …
By
Gamboa, Ariel García; Gress, Neil Hernández; Mendoza, Miguel González; Vargas, Jaime Mora
Show all (4)
This paper presents a strategy to optimize the learning phase of the Support Vector Machine (SVM) algorithm. The SVM algorithm is widely used to solve different tasks such as classification, regression, density estimation and clustering problems. However, the algorithm presents important disadvantages when learning large-scale problems. Training an SVM involves finding the solution of a quadratic optimization problem (QP), which is very resource-consuming. What is more, during the learning step, the best working set must be selected, which is a hard task to perform. In this work, we combine a heuristic approach, which selects the best working-set data, with a projected conjugate gradient method, a fast and easy-to-implement algorithm that solves the quadratic programming problem involved in the SVM algorithm. We compare the performance of the optimization strategies using some well-known benchmark databases.
more …
By
Perez, Gerardo; Mejia, Yuridia P.; Olmos, Ivan; Gonzalez, Jesus A.; Sánchez, Patricia; Vázquez, Candelario
Show all (6)
1 Citations
In this paper we present a new algorithm to find inexact motifs (which are transformed into a set of exact subsequences) in a DNA sequence. Our algorithm builds an automaton that searches for the set of exact subsequences in the DNA database (which can be very long). It starts with a preprocessing phase in which it builds the finite automaton; in this phase it also considers the case in which two different subsequences share a substring (in other words, the subsequences might overlap), which is handled in a way similar to the KMP algorithm. During the searching phase, the algorithm recognizes all instances of the set of input subsequences that appear in the DNA sequence. The automaton is able to perform the search phase in linear time with respect to the length of the input sequence. Experimental results show that the proposed algorithm performs better than the Aho-Corasick algorithm, which has been proved to perform better than the naive approach and, moreover, is considered to run in linear time.
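The abstract's automaton is KMP-like rather than Aho-Corasick, and is reported to outperform it; but the general pattern both share (preprocess the exact subsequences into an automaton with failure links, then scan the sequence once) can be sketched as a generic multi-pattern matcher. This is an illustration of that pattern, not the authors' algorithm:

```python
# Generic multi-pattern matcher: a trie over the patterns plus failure
# links, so patterns that share substrings (i.e. overlap) are still all
# reported during a single linear scan of the sequence.
from collections import deque

def build(patterns):
    """Trie states as dicts, with KMP-style failure links computed by BFS."""
    goto, fail, out = [{}], [0], [set()]
    for p in patterns:
        s = 0
        for ch in p:
            if ch not in goto[s]:
                goto.append({}); fail.append(0); out.append(set())
                goto[s][ch] = len(goto) - 1
            s = goto[s][ch]
        out[s].add(p)
    q = deque(goto[0].values())          # depth-1 states keep fail = 0
    while q:
        s = q.popleft()
        for ch, t in goto[s].items():
            q.append(t)
            f = fail[s]
            while f and ch not in goto[f]:
                f = fail[f]
            fail[t] = goto[f].get(ch, 0)
            out[t] |= out[fail[t]]       # inherit patterns ending here
    return goto, fail, out

def search(text, patterns):
    """Report (position, pattern) for every occurrence, in linear time."""
    goto, fail, out = build(patterns)
    s, hits = 0, []
    for i, ch in enumerate(text):
        while s and ch not in goto[s]:
            s = fail[s]
        s = goto[s].get(ch, 0)
        hits += [(i - len(p) + 1, p) for p in out[s]]
    return hits

print(sorted(search("ACGTACGT", ["ACG", "GTA", "CG"])))
# [(0, 'ACG'), (1, 'CG'), (2, 'GTA'), (4, 'ACG'), (5, 'CG')]
```

Note the overlap handling the abstract describes: the occurrence of `CG` inside `ACG` is found because the state reached after reading `ACG` inherits, via its failure link, the matches of the state for `CG`.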
more …
By
Garcia-Vazquez, Juan P.; Rodriguez, Marcela D.; Andrade, Angel G.
1 Citations
During aging, older adults experience a loss of functional capabilities. This may prevent older adults from continuing to perform their activities of daily living independently at home. We propose Ambient Information Systems (AIS) as appropriate pervasive devices that can enrich elders’ activities and promote their autonomy during the execution of their tasks. To illustrate this, in this paper we present an AIS for supporting medicine administration. By designing AIS such as the one presented in this paper, we have identified design issues and characteristics to be incorporated into AIS for supporting elders’ autonomy in their homes.
more …
By
Bayro Kaiser, Esteban Tobias; Correa-Arameda, Eduardo; Bayro-Corrochano, Eduardo
This paper presents a gray-scale image compression method using the Wavelet Transform and key feature detection for mobile phone video transmission. The major contribution of this work is to show the application of the wavelet transform to image compression and to add a new method, key feature detection, to reduce redundant information in video transmission. An algorithm is designed in Matlab to accomplish this task using a face-to-face video.
more …
By
Chakraborty, Debrup
In this work we propose a new method to create neural network ensembles. Our methodology builds on the conventional technique of bagging, where multiple classifiers are trained on a single training data set by generating multiple bootstrap samples from the training data. We propose a new method of sampling using k-nearest neighbor density estimates. Our sampling technique gives rise to more variability in the data sets than bagging does. We validate our method by testing on several real data sets and show that it outperforms bagging.
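A minimal, hedged sketch of density-driven resampling: the abstract does not say how the k-NN density estimates drive the sampling, so the inverse-density bootstrap weights below (favouring points in sparse regions) are an illustrative assumption, not the paper's scheme:

```python
# Bagging variant sketch: bootstrap draws weighted by a crude k-NN
# density estimate instead of uniformly. The inverse-density weighting
# (oversampling sparse regions) is an assumption for illustration.
import numpy as np

def knn_density(X, k=3):
    """Crude kNN density estimate: inverse of the distance to the k-th
    nearest neighbour of each point."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    kth = np.sort(d, axis=1)[:, k]     # column 0 is the zero self-distance
    return 1.0 / (kth + 1e-9)

def density_sample(X, k=3, rng=None):
    """Draw a bootstrap-style sample biased towards low-density points."""
    if rng is None:
        rng = np.random.default_rng(0)
    w = 1.0 / knn_density(X, k)        # weight = estimated sparseness
    w = w / w.sum()
    idx = rng.choice(len(X), size=len(X), replace=True, p=w)
    return X[idx]
```

Each ensemble member would then be trained on one such sample, exactly as in bagging; the non-uniform draw is what should make the members' training sets differ more than plain bootstrap samples do.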
more …
By
Bravo, Jose; Hervas, Ramon; Fuentes, Carmen; Villarreal, Vladimir; Chavira, Gabriel; Nava, Salvador; Fontecha, Jesus; Casero, Gregorio; Peña, Rocio; Vergara, Marcos
Show all (10)
Intelligent environments need interactions capable of detecting users and providing them with good-quality contextual information. In this sense we adapt technologies, identifying and locating people in order to support their needs. However, it is necessary to analyze some important features in order to compare the implicit interaction, which is closer to the users and more natural, with a new interaction by contact. In this paper we present the adaptability of two technologies: Radio Frequency Identification (RFID) and Near Field Communication (NFC). With the first, the interaction is more appropriate within intelligent environments, but with the second, the same RFID technology, placed in mobile phones, achieves some advantages that we consider an intermediate solution until the standardization of sensors arrives.
more …
By
Hernandez, Josue; Morita, Hiroshi; Nakano-Miyatake, Mariko; Perez-Meana, Hector
Show all (4)
The use of image processing schemes as part of security systems has been increasing, in order to detect and classify, as well as to track, object and human motion with high precision. To this end several approaches using image processing techniques have been proposed during the last decades, because computer vision lets us manipulate digital image sequences to extract useful information contained in a video stream. In this paper we present a motion detection algorithm based on motion vector estimation, where the estimated vectors are subsequently filtered to obtain better information about the real motion in a given scene. Experimental results show the accuracy of the proposed system.
more …
By
Santiago-Sánchez, Karen; Reyes-García, Carlos A.; Gómez-Gil, Pilar
8 Citations
Crying is an acoustic event that contains information about the functioning of the central nervous system, and the analysis of an infant’s crying can support differential diagnosis in cases like asphyxia and hyperbilirubinemia. The classification of baby cries has been attempted through the use of different types of neural networks and other recognition approaches. In this work we present a pattern classification algorithm based on Type-2 fuzzy logic with which the classification of infant cries is performed. Experiments as well as results are also shown.
more …
By
Escalante, Hugo Jair; Montes, Manuel; Villaseñor, Luis
14 Citations
Authorship verification is the task of determining whether documents were or were not written by a certain author. The problem has been addressed by using binary classifiers, one per author, that make individual yes/no decisions about the authorship of documents. Traditionally, the same learning algorithm is used when building the classifiers of the considered authors. However, the individual problems that such classifiers face differ from author to author, so using a single algorithm may lead to unsatisfactory results. This paper describes the application of particle swarm model selection (PSMS) to the problem of authorship verification. PSMS selects an ad-hoc classifier for each author in a fully automatic way; additionally, PSMS also chooses preprocessing and feature selection methods. Experimental results on two collections give evidence that classifiers selected with PSMS are advantageous over selecting the same classifier for all of the authors involved.
more …
By
Rangel-Valdez, Nelson; Torres-Jimenez, Jose
1 Citations
It is known that some NP-Complete problems exhibit sharp phase transitions with respect to some order parameter. Moreover, in some of these problems there is a correlation between that critical behavior and the hardness of finding a solution. This paper shows experimental evidence of the existence of a critical behavior in the computational cost of solving the bandwidth minimization problem for graphs (BMPG). The experimental design involved the density of a graph as the order parameter, 200000 random connected graphs of size 16 to 25 nodes, and a branch-and-bound algorithm taken from the literature. The results reveal a bimodal phase transition in the computational cost of solving the BMPG instances. This behavior was confirmed by the results obtained with metaheuristics that solve a known BMPG benchmark.
more …
By
Moreno-Montiel, C. H.; Rojas-González, F.; Román-Alonso, G.; Cordero-Sánchez, S.; Castro-García, M. A.; Aguilar-Cornejo, M.
Show all (6)
4 Citations
A parallel simulator, based on the Dual Site-Bond Model of complex media, is developed to study Hg intrusion and extrusion processes in the myriad of voids contained in a porous network. In order to reduce the requirements in RAM and computing resources, the porous network is partitioned into several subnetworks distributed over different cluster processors. The simulator uses shared memory to process each subnetwork (with OpenMP) and applies a message passing protocol (with MPI) to allow communication among different processors. We show experimental results that reflect the good performance of our proposal when using porous networks of different sizes on a cluster with 32 nodes, each having 4 processors.
more …
By
García, V.; Mollineda, R. A.; Sánchez, J. S.
15 Citations
This paper introduces a new metric, named the Index of Balanced Accuracy, for evaluating learning processes in two-class imbalanced domains. The method combines an unbiased index of overall accuracy with a measure of how dominant the class with the highest individual accuracy rate is. Some theoretical examples are presented to illustrate the benefits of the new metric over other well-known performance measures. Finally, a number of experiments demonstrate the consistency and validity of the proposed evaluation method.
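In the published formulation of this metric (as we read it; the abstract itself gives no formula, so treat the exact weighting as an assumption), the index scales the squared geometric mean of the two class accuracies by a dominance term: IBA_α = (1 + α·(TPR − TNR)) · TPR · TNR. A one-line sketch:

```python
def iba(tpr, tnr, alpha=0.1):
    """Index of Balanced Accuracy sketch: TPR*TNR is the squared geometric
    mean (the overall-accuracy term); (TPR - TNR), the dominance, measures
    how unbalanced the two class accuracies are. Formula assumed from the
    literature, not quoted from this abstract."""
    return (1 + alpha * (tpr - tnr)) * tpr * tnr

# A perfectly accurate classifier scores 1; two classifiers with the same
# product of class accuracies are separated by the direction of imbalance:
print(iba(1.0, 1.0))                  # 1.0
print(iba(0.9, 0.5), iba(0.5, 0.9))   # ~0.468 vs ~0.432
```

The dominance term is what distinguishes the metric from the plain geometric mean, which would score the last two classifiers identically.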
more …
By
Lizárraga, Giovanni; Gomez, Marco Jimenez; Castañon, Mauricio Garza; Acevedo-Davila, Jorge; Rionda, Salvador Botello
Show all (5)
1 Citations
When evaluating the quality of non-dominated sets, two families of quality indicators are frequently used: unary quality indicators (UQIs) and binary quality indicators (BQIs). For several years, UQIs have been considered inferior to BQIs. As a result, the use of UQIs has been discouraged, even though in practice they are easier to use. In this work, we study the reasons why UQIs are considered inferior. We make a detailed analysis of the correctness of these reasons and the implicit assumptions on which they are based. The conclusion is that, contrary to what is widely believed, unary quality indicators are not inferior to binary ones.
more …
By
Calvo, Hiram; Inui, Kentaro; Matsumoto, Yuji
1 Citations
We propose a model based on the Word Space Model for calculating the plausibility of candidate arguments given one verb and one argument. The resulting information can be used in coreference resolution, zero-pronoun resolution or syntactic ambiguity tasks. Previous work, such as Selectional Preferences or Semantic Frames acquisition, focuses on this task using supervised resources, or predicts arguments independently from each other. In this work we explore the extraction of plausible arguments considering their correlation, using no more information than that provided by the dependency parser. This creates a data sparseness problem, which we alleviate by using a distributional thesaurus built from the same data for smoothing. We compare our model with the traditional PLSI method.
By
Puente, Cesar; Olague, Gustavo; Smith, Stephen V.; Bullock, Stephen H.; González-Botello, Miguel A.; Hinojosa-Corona, Alejandro
3 Citations
Today the most popular method for extracting vegetation information from remote sensing data is through vegetation indices. In particular, erosion models are based on vegetation indices that are used to estimate the "cover factor" (C) defined by healthy, dry, or dead vegetation in a popular soil erosion model named RUSLE ("Revised Universal Soil Loss Equation"). Several works correlate vegetation indices with C in order to characterize a broad area. However, the results are in general not satisfactory because most indices focus only on healthy vegetation. The aim of this study is to devise a new approach that automatically creates vegetation indices that account for dry and dead plants besides healthy vegetation. For this task we propose a novel methodology based on Genetic Programming (GP), summarized as follows. First, the problem is posed as a search problem whose objective is to find the index that correlates best with field-measured C-factor data. Then, new indices are built by GP from a set of numerical operators and bands until the best composite index is found. In this way, GP was able to develop several new indices that correlate better than traditional indices such as NDVI and the SAVI family. We conclude with a real-world example showing that it is viable to synthesize indices optimally correlated with the C factor using this methodology. This gives us confidence that the method could be applied in soil erosion assessment.
By
Quintanilla-Domínguez, J.; Cortina-Januchs, M. G.; Barrón-Adame, J. M.; Vega-Corona, A.; Buendía-Buendía, F. S.; Andina, D.
Breast cancer is one of the leading causes of mortality among women in the world. Clusters of microcalcifications (MCCs) in mammograms can be an important early sign of breast cancer, so their detection is important for preventing and treating the disease. In this paper, we present a novel method for the detection of MCCs in mammograms which consists of image enhancement by adaptive histogram equalization, MCC edge detection by Coordinate Logic Filters (CLF), and generation, clustering and labelling of sub-optimal feature vectors by means of a Self-Organizing Map (SOM) neural network. For comparison, we also applied unsupervised k-means clustering in the labelling stage of our method. In that stage, we obtain better results with the proposed SOM neural network than with the k-means algorithm. We thus show that the proposed method can locate MCCs in an efficient way.
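An illustrative, minimal 1-D SOM, not the paper's exact network or feature vectors, showing the compete-and-move training that the clustering/labelling stage relies on:

```python
import math
import random

def train_som(data, n_units=4, epochs=50, lr0=0.5, sigma0=1.0, seed=0):
    """Train a tiny 1-D Self-Organizing Map: for each vector, the
    best-matching unit and its line neighbours move toward it."""
    rng = random.Random(seed)
    dim = len(data[0])
    units = [[rng.random() for _ in range(dim)] for _ in range(n_units)]
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)                  # decaying learning rate
        sigma = max(sigma0 * (1 - epoch / epochs), 0.3)  # shrinking neighbourhood
        for x in data:
            bmu = min(range(n_units),
                      key=lambda i: sum((u - v) ** 2 for u, v in zip(units[i], x)))
            for i, unit in enumerate(units):
                h = math.exp(-((i - bmu) ** 2) / (2 * sigma ** 2))
                for d in range(dim):
                    unit[d] += lr * h * (x[d] - unit[d])
    return units

def label(x, units):
    """Assign a vector to its best-matching unit (its cluster label)."""
    return min(range(len(units)),
               key=lambda i: sum((u - v) ** 2 for u, v in zip(units[i], x)))
```

After training on two well-separated groups, vectors from different groups map to different units, which is the behaviour the labelling stage exploits.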
By
Valdez, Fevrier; Melin, Patricia; Castillo, Oscar
1 Citation
We describe in this paper a new hybrid approach for mathematical function optimization that combines Particle Swarm Optimization (PSO) and Genetic Algorithms (GAs), using Fuzzy Logic to integrate the results. The new evolutionary method combines the advantages of PSO and GA to give an improved PSO+GA hybrid method. Fuzzy Logic is used to combine the results of the PSO and GA in the best way possible. The new hybrid PSO+GA approach is compared with the PSO and GA methods on a set of benchmark mathematical functions, and is shown to be superior to the individual evolutionary methods.
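A hedged sketch of the fuzzy integration idea; the membership function below is invented for illustration and is much simpler than a real fuzzy rule base:

```python
def fuzzy_combine(fit_pso, fit_ga, x_pso, x_ga):
    """Blend the best solutions of PSO and GA (minimisation): a clipped
    linear membership over the normalised fitness gap decides how much
    each method's solution contributes to the combined answer."""
    # positive gap means PSO's solution has the lower (better) fitness
    gap = (fit_ga - fit_pso) / (abs(fit_pso) + abs(fit_ga) + 1e-12)
    w = min(max(0.5 + gap, 0.0), 1.0)   # membership of "prefer PSO" in [0, 1]
    return [w * a + (1 - w) * b for a, b in zip(x_pso, x_ga)]
```

When one method is clearly better its solution dominates the blend; when both are comparable the combination averages them, which is the spirit of fuzzy result integration.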
By
Rebolledo-Mendez, Genaro; Dunwell, Ian; Martínez-Mirón, Erika A.; Vargas-Cerdán, María Dolores; Freitas, Sara; Liarokapis, Fotis; García-Gaona, Alma R.
30 Citations
This paper presents the results of a usability evaluation of NeuroSky's MindSet (MS). Until recently, most Brain-Computer Interfaces (BCIs) have been designed for clinical and research purposes, partly due to their size and complexity. However, a new generation of consumer-oriented BCIs has appeared for the video game industry. The MS, a headset with a single electrode, is based on electroencephalogram (EEG) readings, capturing faint electrical signals generated by neural activity. The electrical signal across the electrode is measured to determine levels of attention (based on alpha waveforms) and then translated into binary data. This paper presents the results of an evaluation to assess the usability of the MS by defining a model of attention to fuse attention signals with user-generated data in a Second Life assessment exercise. The results of this evaluation suggest that the MS provides accurate readings regarding attention, since there is a positive correlation between measured and self-reported attention levels. The results also suggest there are some usability and technical problems with its operation. Future research is outlined, consisting of the definition of a standardized reading methodology and of an algorithm to level out the natural fluctuation of users' attention levels if they are to be used as inputs.
By
Cruz, Benjamín; Sossa, Humberto; Barrón, Ricardo
1 Citation
Associative memories (AMs) have been extensively used during the last 40 years for pattern classification and pattern restoration. In this paper Conformal Geometric Algebra (CGA) is used to develop a new associative memory. The proposed AM makes use of CGA and quadratic programming to store associations among patterns and their respective classes. An unknown pattern is classified by applying an inner product between the pattern and the built AM. Numerical and real examples are presented to show the potential of the proposal.
By
Avilés-López, Edgardo; García-Macías, J. Antonio
61 Citations
Wireless sensor networks provide the means for gathering vast amounts of data from physical phenomena, and as such they are being used for applications such as precision agriculture, habitat monitoring, and others. However, there is a need for higher-level abstractions for the development of applications, since accessing the data from wireless sensor networks currently implies dealing with very low-level constructs. We propose TinySOA, a service-oriented architecture that allows programmers to access wireless sensor networks from their applications through a simple service-oriented API in the language of their choice. We present an implementation of TinySOA and the results of an experiment in which programmers developed an application exemplifying how easily Internet applications can integrate sensor networks.
By
Avenel, Christophe; Mémin, Etienne; Pérez, Patrick
4 Citations
The joint analysis of motions and deformations is crucial in a number of computer vision applications. In this paper, we introduce a nonlinear stochastic filtering technique to track the state of a free curve. The approach we propose is implemented through a particle filter which includes color measurements characterizing the target and the background respectively. We design a continuous-time dynamics that allows us to infer inter-frame deformations. The curve is defined by an implicit level-set representation, and the stochastic dynamics is expressed on the level-set function. It takes the form of a stochastic differential equation with a Brownian motion of low dimension. Specific noise models lead to traditional evolution laws based on mean curvature motion, while other forms lead to new evolution laws with different smoothing behaviors. In these evolution models, we propose to combine local motion information extracted from the images with an uncertainty model of the dynamics. The associated filter we propose for curve tracking thus belongs to the family of conditional particle filters. Its capabilities are demonstrated on various sequences with highly deformable objects.
By
Salinas-Gutiérrez, Rogelio; Hernández-Aguirre, Arturo; Villa-Diharce, Enrique R.
10 Citations
A new way of modeling probabilistic dependencies in Estimation of Distribution Algorithms (EDAs) is presented. By means of copulas it is possible to separate the structure of dependence from the marginal distributions in a joint distribution. The use of copulas as a mechanism for modeling joint distributions, and its application to EDAs, is illustrated on several benchmark examples.
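A sketch of the core copula idea: a Gaussian copula carries the dependence structure while arbitrary marginals are plugged in as inverse CDFs. The choice of a bivariate Gaussian copula here is an assumption for illustration; the abstract does not state which copula family the paper uses.

```python
import math
import random

def sample_gaussian_copula(rho, n, marg_inv_u, marg_inv_v, seed=0):
    """Draw n pairs whose dependence is a Gaussian copula with correlation
    `rho`, and whose marginals are given by the inverse CDFs
    `marg_inv_u` and `marg_inv_v` (dependence and marginals separated)."""
    rng = random.Random(seed)

    def phi(z):  # standard normal CDF
        return 0.5 * (1 + math.erf(z / math.sqrt(2)))

    out = []
    for _ in range(n):
        z1 = rng.gauss(0, 1)
        z2 = rho * z1 + math.sqrt(1 - rho ** 2) * rng.gauss(0, 1)  # correlated normals
        u, v = phi(z1), phi(z2)           # uniforms carrying Gaussian dependence
        out.append((marg_inv_u(u), marg_inv_v(v)))
    return out
```

In an EDA this is the sampling step: the copula parameters and the marginals are estimated from the selected population, then new candidate solutions are drawn as above.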
By
Vargas-Govea, Blanca; Morales, Eduardo F.
1 Citation
Many tasks can be described by sequences of actions that normally exhibit some form of structure and that can be represented by a grammar. This paper introduces FOSeq, an algorithm that learns grammars from sequences of actions. The sequences are given as low-level traces of sensor readings that are transformed into a relational representation. Given a transformed sequence, FOSeq identifies frequent subsequences of n items, or n-grams, to generate new grammar rules until no more frequent n-grams can be found. From m sequences of the same task, FOSeq generates m grammars and performs a generalization process over the best grammar to cover most of the sequences. The grammars induced by FOSeq can be used to perform a particular task and to classify new sequences. FOSeq was tested on robot navigation tasks and on gesture recognition, with competitive performance against other approaches based on Hidden Markov Models.
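A sketch of the frequent-n-gram rule-induction step as described in the abstract, with the relational representation and the generalization process omitted; names are illustrative:

```python
from collections import Counter

def induce_rules(seq, n=2, min_count=2):
    """Repeatedly replace the most frequent n-gram of actions with a fresh
    nonterminal, creating one grammar rule per replacement, until no
    n-gram occurs at least `min_count` times."""
    rules, next_id = {}, 0
    seq = list(seq)
    while True:
        grams = Counter(tuple(seq[i:i + n]) for i in range(len(seq) - n + 1))
        gram, count = grams.most_common(1)[0] if grams else (None, 0)
        if count < min_count:
            return seq, rules
        nt = f"R{next_id}"
        next_id += 1
        rules[nt] = gram                    # new rule: nt -> gram
        out, i = [], 0
        while i < len(seq):
            if tuple(seq[i:i + n]) == gram:
                out.append(nt)
                i += n                      # non-overlapping replacement
            else:
                out.append(seq[i])
                i += 1
        seq = out
```

On the action string "ababab" this yields R0 -> a b and then R1 -> R0 R0, compressing the trace to the sentential form R1 R0.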
By
Arana-Daniel, Nancy; López-Franco, Carlos; Bayro-Corrochano, Eduardo
1 Citation
This paper presents an improvement of a recurrent learning system called LSTM-CSVM (introduced in [1]) for robot navigation applications. This approach is used to deal with some of the main issues addressed in the research area: the problem of navigation in large domains, partial observability, a limited number of learning experiences, and slow learning of optimal policies. The advantages of this new version of the LSTM-CSVM system are that it can find optimal paths through mazes and that it reduces the number of generations needed to evolve the system towards the optimal navigation policy, thereby also reducing the training time. This is done by adding a heuristic methodology to find the optimal path from the start state to the goal state. The system's input can contain information about the whole environment or just partial information about it.
By
Flores, Juan J.; Loaeza, Roberto; Rodríguez, Héctor; Cadenas, Erasmo
4 Citations
The design of models for time series prediction has found a solid foundation on statistics. Recently, artificial neural networks have been a good choice as approximators to model and forecast time series. Designing a neural network that provides a good approximation is an optimization problem. Given the many parameters to choose from in the design of a neural network, the search space in this design task is enormous. When designing a neural network by hand, scientists can only try a few of them, selecting the best one of the set they tested. In this paper we present a hybrid approach that uses evolutionary computation to produce a complete design of a neural network for modeling and forecasting time series. The resulting models have proven to be better than the ARIMA and the handmade artificial neural network models.
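A minimal sketch of evolutionary search over network design choices. The encoding, operators and fitness here are invented for illustration: the toy fitness below stands in for the validation error of a network actually trained on the time series.

```python
import random

def evolve_config(fitness, space, pop_size=12, gens=20, seed=0):
    """Evolve a point in a discrete hyperparameter space (lower fitness is
    better): keep the elite half, then refill with uniform crossover of
    elite parents plus occasional mutation."""
    rng = random.Random(seed)
    keys = sorted(space)
    pop = [{k: rng.choice(space[k]) for k in keys} for _ in range(pop_size)]
    for _ in range(gens):
        elite = sorted(pop, key=fitness)[: pop_size // 2]
        children = []
        for _ in range(pop_size - len(elite)):
            p, q = rng.sample(elite, 2)
            child = {k: rng.choice((p[k], q[k])) for k in keys}  # uniform crossover
            if rng.random() < 0.3:                               # mutation
                k = rng.choice(keys)
                child[k] = rng.choice(space[k])
            children.append(child)
        pop = elite + children
    return min(pop, key=fitness)

# hypothetical design space; the toy fitness pretends the best network
# has 8 hidden units and uses 4 lagged inputs
space = {"hidden": [2, 4, 8, 16, 32], "lags": [1, 2, 4, 8]}
best = evolve_config(lambda c: abs(c["hidden"] - 8) + abs(c["lags"] - 4), space)
```

In the real setting each fitness evaluation would train and validate a forecaster, which is why keeping the population and generation counts small matters.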
By
Pazos R., Rodolfo A.; Martínez F., José A.; González B., Juan J.; Morales-Rodríguez, María Lucila; Rojas P., Jessica C.
In this paper a method is presented that permits the automatic extraction of lexical-semantic relations between nouns (specifically concrete nouns, since they have a well-structured taxonomy). From the definitions of the entries in a Spanish dictionary, the hypernym of an entry is extracted from the entry definition according to the basic assumption that the first noun in the definition is the entry's hypernym. After obtaining the hypernym for each entry, multilayered hyponymy-hypernymy relations are generated from a noun which is considered the root of the domain. The domains for which this approach was tested were zoology and botany. Five levels of hyponymy-hypernymy relations were generated for each domain. For the zoology domain a total of 1,326 relations was obtained, with an average percentage of correctly generated relations (precision) of 84.31% over the five levels. 91.32% of all the relations of this domain were obtained in the first three levels, and for each of these levels the precision exceeds 96%. For the botany domain a total of 1,199 relations was obtained, with an average precision of 71.31% over the five levels. 90.76% of all the relations of this domain were obtained in the first level, and for this level the precision exceeds 99%.
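The first-noun assumption can be sketched as follows; the crude function-word filter below stands in for the part-of-speech tagging a real system would use, and the stopword list is illustrative:

```python
import re

def hypernym(definition, stopwords=("un", "una", "el", "la", "los", "de")):
    """First-noun heuristic: take the first token of a dictionary
    definition that is not a function word as the entry's hypernym."""
    for token in re.findall(r"\w+", definition.lower()):
        if token not in stopwords:
            return token
    return None
```

For a Spanish definition such as "Un mamífero carnívoro de la familia de los cánidos", the heuristic returns "mamífero", so the entry is linked as a hyponym of mamífero; chaining such links level by level produces the multilayered relations described above.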
By
Bracho-Rios, Josue; Torres-Jimenez, Jose; Rodriguez-Tello, Eduardo
6 Citations
A Covering Array, denoted by CA(N; t, k, v), is a matrix of size N × k in which each of the v^t combinations of symbols appears at least once in every set of t columns. Covering Arrays (CAs) are combinatorial objects used in software testing. There are different methods to construct CAs but, as it is a highly combinatorial problem, few complete algorithms to construct CAs have been reported. In this paper a new backtracking algorithm based on the Branch & Bound technique is presented. It searches only non-isomorphic Covering Arrays to reduce the search space of the construction problem. The results obtained with this algorithm are able to match some of the best known solutions for small instances of binary CAs.
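The defining property can be checked directly; here is a sketch of such a verifier (brute force over column subsets, unrelated to the paper's Branch & Bound construction):

```python
from itertools import combinations

def is_covering_array(matrix, t, v):
    """Check the CA(N; t, k, v) property: every projection of the N x k
    matrix onto t columns contains all v**t symbol combinations."""
    k = len(matrix[0])
    for cols in combinations(range(k), t):
        seen = {tuple(row[c] for c in cols) for row in matrix}
        if len(seen) < v ** t:          # some t-tuple never appears
            return False
    return True
```

The classic CA(4; 2, 3, 2) with rows 000, 011, 101, 110 passes this check: every pair of its three columns exhibits all four binary pairs.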
By
Suárez, Airel Pérez; Trinidad, José Fco. Martínez; Ochoa, Jesús A. Carrasco; Medina Pagola, José E.
3 Citations
In this paper, a new algorithm for incremental overlapped clustering, called Incremental Clustering by Strength Decision (ICSD), is introduced. ICSD obtains a set of dense and overlapped clusters using a new graph cover heuristic while reducing the amount of computation by maintaining the cluster structure incrementally. The experimental results show that our proposal outperforms other graph-based clustering algorithms on quality measures, and also that ICSD achieves better time performance than other incremental graph-based algorithms.
By
Wörsdörfer, Florian; Stock, Florian; Bayro-Corrochano, Eduardo; Hildenbrand, Dietmar
1 Citation
The usage of Conformal Geometric Algebra leads to algorithms that can be formulated in a very clear and easy-to-grasp way. It can also increase the performance of an implementation because of its suitability for parallel computation. In this paper we show how a grasping algorithm for a robotic arm is accelerated using a Conformal Geometric Algebra formulation. The optimized C code is produced automatically by the CGA framework Gaalop. We compare this implementation with a CUDA implementation and with an implementation that uses standard vector algebra.
By
Zaragoza, Julio H.; Morales, Eduardo F.
Reinforcement Learning is a commonly used technique in robotics; however, traditional algorithms are unable to handle large amounts of data coming from the robot's sensors, require long training times, are unable to reuse learned policies on similar domains, and use discrete actions. This work introduces TS-RRLCA, a two-stage method to tackle these problems. In the first stage, low-level data coming from the robot's sensors is transformed into a more natural, relational representation based on rooms, walls, corners, doors and obstacles, significantly reducing the state space. We also use Behavioural Cloning, i.e., learning from traces provided by the user, to obtain, in few iterations, a relational policy that can be reused in different environments. In the second stage, we use Locally Weighted Regression to transform the initial policy into a continuous-actions policy. We tested our approach with a real service robot in different environments on different navigation and following tasks. Results show that the policies can be used in different domains and produce smoother, faster and shorter paths than the original policies.
By
Quintanilla-Domínguez, J.; Ojeda-Magaña, B.; Seijas, J.; Vega-Corona, A.; Andina, D.
Breast cancer is one of the leading causes of mortality among women in the world. Clusters of Microcalcifications (MCCs) in mammograms can be an important early sign of breast cancer, so their detection is important for preventing and treating the disease. Coordinate Logic Filters (CLF) are very efficient in digital signal processing applications such as noise removal, magnification, opening, closing, skeletonization and coding, as well as in edge detection, feature extraction and fractal modelling. This paper presents an edge detector for MCCs in Regions of Interest (ROIs) from mammograms based on a novel combination of image enhancement by an adaptive histogram technique, a Self-Organizing Map (SOM) neural network, and CLF. The experimental results show that the proposed method can locate MCC edges. Moreover, the proposed method is quantitatively evaluated by Pratt's figure of merit, and visually compared, against two widely used edge detectors, achieving the best results.
By
Cruz-Barbosa, Raúl; Vellido, Alfredo
In many real problems that ultimately require data classification, not all the class labels are readily available. This concerns the field of semi-supervised learning, in which missing class labels must be inferred from the available ones as well as from the natural cluster structure of the data. This structure can sometimes be quite convoluted. Previous research has shown the advantage, in these cases, of using the geodesic metric in clustering models of the manifold learning family to reveal the underlying true data structure. In this brief paper, we present a novel semi-supervised approach, namely Semi-Supervised Geo-GTM (SS-Geo-GTM). This is an extension of Geo-GTM, a variation on the Generative Topographic Mapping (GTM) manifold learning model for data clustering and visualization that resorts to the geodesic metric. SS-Geo-GTM uses a proximity graph built from the Geo-GTM manifold as the basis for a label propagation algorithm that infers missing class labels. Its performance is compared to those of a semi-supervised version of the standard GTM and of the alternative Laplacian Eigenmaps method.
By
Sánchez-Cruz, Hermilo; Rodríguez-Díaz, Mario A.
1 Citation
This is an extension of the paper that appeared in [15]. This time, we compare four methods: arithmetic coding applied to the 3OT chain code (Arith-3OT), arithmetic coding applied to DFCCE (Arith-DFCCE), and Huffman coding applied to the DFCCE chain code (Huff-DFCCE); to measure the efficiency of the chain codes, we also compare the methods with JBIG, which constitutes an international standard. In the search for a suitable and better representation of contour shapes, our tests suggest that a sound method to represent contour shapes is 3OT, because arithmetic coding applied to it gives the best results with respect to JBIG, independently of the perimeter of the contour shapes.
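A sketch of the Huffman stage applied to a chain-code string over the 3OT alphabet {0, 1, 2}; the chain codes themselves and the arithmetic-coding variants are beyond this sketch:

```python
import heapq
from collections import Counter

def huffman_lengths(symbols):
    """Build a Huffman code for a symbol string and return (codebook,
    total encoded length in bits). Each heap entry carries a partial
    codebook; merging two entries prepends one bit to every code."""
    freq = Counter(symbols)
    heap = [(n, i, {s: ""}) for i, (s, n) in enumerate(freq.items())]
    heapq.heapify(heap)
    i = len(heap)
    while len(heap) > 1:
        n1, _, c1 = heapq.heappop(heap)
        n2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + b for s, b in c1.items()}
        merged.update({s: "1" + b for s, b in c2.items()})
        heapq.heappush(heap, (n1 + n2, i, merged))
        i += 1
    codes = heap[0][2]
    return codes, sum(freq[s] * len(codes[s]) for s in freq)
```

Frequent chain-code symbols get short codewords, which is why the symbol statistics of a code like 3OT (few, highly skewed symbols) determine how well it compresses.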
By
Avilés-López, Edgardo; García-Macías, J. Antonio
1 Citation
The current Web 2.0 stage of the Internet provides the basis for web-based communities and services aimed at collaboration and information sharing. Furthermore, the Internet is now an application platform in which Web applications can be integrated to provide augmented services that lay the basis for ubiquitous computing scenarios. Recently, the concept of mashups has been used to refer to applications built upon the integration and combination of public Web APIs and data sources. Ubiquitous computing mashups go further by combining the functionality of both software and hardware components in an attempt to exploit the computation and services provided by everyday objects. Typically, developing a mashup requires highly specialized knowledge of many topics (such as different programming interfaces and languages). This problem is greatly magnified when developing mashups of both physical and digital services, due to the various integration and communication issues. We exemplify these concepts through the UbiSOA Editor, a system that allows the creation of ubiquitous computing mashups through simple activities such as dragging and dropping graphical representations of the involved services in a desired scenario. We then describe the planning and execution of a sample scenario as a showcase of what can be easily accomplished.
By
Robles, Guillermo Cortes; Hernández, Giner Alor; Lasserre, Alberto Aguilar; Martínez, Ulises Juárez; Gomez, Ruben Posada; Gomez, Juan Miguel; González, Alejandro Rodríguez
The application of Case-Based Reasoning (CBR) has proved its efficacy as a pragmatic approach to assisting problem-solving activities, constructing knowledge-based decision systems and supporting the organizational learning process. Nevertheless, its application in innovative design, an activity that involves knowledge, problem-solving activities, creativity and social interaction, is still poorly exploited. In this document, CBR is connected to the Theory of Inventive Problem Solving (TRIZ) to propose a synergy capable of assisting the innovation process. The synergy makes use of several TRIZ concepts but, in the present context, emphasizes the relevance of the available resources in a technical system as a vector for driving problem-solving activities and transferring knowledge.
By
Cruz-Barbosa, Raúl; Vellido, Alfredo
The diagnosis and prognosis of human brain tumours, especially when they are aggressive, are sensitive clinical tasks that usually require non-invasive measurement techniques. Outcome information for aggressive tumours, in particular, is usually scarce. In this paper, we aim to gauge the capability of a novel semi-supervised model, SS-Geo-GTM, to infer outcome stages from a very limited amount of available stage labels and Magnetic Resonance Spectroscopy (MRS) data corresponding to glioblastoma, which is an aggressive tumour type. This model stems from a geodesic distance-based extension of Generative Topographic Mapping (Geo-GTM) that prioritizes neighbourhood relationships along a generated manifold embedded in the observed data space.
By
Rosenblueth, David A.; Stephens, Christopher R.
Recombination is an important operator in the evolution of biological organisms and has also played an important role in Evolutionary Computation. In neither field, however, is there a clear understanding of why recombination exists and under what circumstances it is useful. In this paper we consider the utility of recombination in the context of a simple Genetic Algorithm (GA). We show how its utility depends on the particular landscape considered. We also show how the ease with which this question may be addressed depends intimately on the particular representation used for the population in the GA, i.e., a representation in terms of genotypes, Building Blocks or Walsh modes. We show how, for non-epistatic landscapes, a description in terms of Building Blocks manifestly shows that recombination is always beneficial, leading to a "royal road" towards the optimum, while the contrary is true for highly epistatic landscapes such as "needle-in-a-haystack".
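A toy GA illustrating the non-epistatic case; onemax (fitness = number of ones, each bit an independent Building Block) stands in for the landscapes where recombination is beneficial. The paper's own analysis is representation-theoretic, not this experiment.

```python
import random

def one_point_crossover(a, b, rng):
    """Recombine two bitstrings at a random cut point."""
    cut = rng.randrange(1, len(a))
    return a[:cut] + b[cut:], b[:cut] + a[cut:]

def crossover_ga(fitness, length=20, pop_size=30, gens=60, seed=0):
    """GA with selection and crossover only (no mutation): on a
    non-epistatic landscape, good blocks from different parents combine."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]        # elitist truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            p, q = rng.sample(parents, 2)
            c1, c2 = one_point_crossover(p, q, rng)
            children.extend([c1, c2])
        pop = parents + children[: pop_size - len(parents)]
    return max(pop, key=fitness)

best = crossover_ga(sum)   # onemax: fully non-epistatic
```

On needle-in-a-haystack, by contrast, no partial solution carries fitness information, so the same crossover machinery degenerates to blind search.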
By
Alarcón, Rodrigo; Sierra, Gerardo; Bach, Carme
2 Citations
Terminological work aims to identify knowledge about terms in specialised texts in order to compile dictionaries, glossaries or ontologies. Searching for definitions of the terms that terminographers intend to define is therefore an essential task. This search can be carried out on specialised corpora, where definitions usually appear in definitional contexts, i.e. text fragments where an author explicitly defines a term. We present research focused on the automatic extraction of those definitional contexts. The methodology includes three different processes: the extraction of definitional patterns, the automatic filtering of non-relevant contexts, and the automatic identification of constitutive elements, i.e., terms and definitions.
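A sketch of the pattern-extraction step. The paper works on Spanish corpora with far richer patterns and a filtering stage, so the two English patterns below are purely illustrative:

```python
import re

# illustrative definitional patterns ("X is defined as Y", "X is a Y")
PATTERNS = [
    re.compile(r"(?P<term>[\w -]+?) is defined as (?P<definition>[^.]+)", re.I),
    re.compile(r"(?P<term>[\w -]+?) is a (?P<definition>[^.]+)", re.I),
]

def extract_definitional_contexts(text):
    """Return (term, definition) pairs found by simple verbal patterns,
    one match per sentence at most."""
    hits = []
    for sentence in re.split(r"(?<=\.)\s+", text):
        for pat in PATTERNS:
            m = pat.search(sentence)
            if m:
                hits.append((m.group("term").strip(),
                             m.group("definition").strip()))
                break
    return hits
```

Pattern matching alone over-generates (e.g. "he is a friend" is not a definition), which is exactly why the methodology includes the second, filtering process.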
By
Sagols, Feliú; Marín, Raúl
1 Citation
The Inscribed Square Conjecture has been open since 1911. It states that any plane Jordan curve J contains four points that form a non-degenerate square. In this article we prove that the conjecture holds for digital simple closed 4-curves, and that it is false for 8-curves. The given proof is based on a theorem due to Stromquist. We also discuss some properties of simple closed 4-curves in the digital plane containing a single non-degenerate inscribed square.
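A brute-force check for inscribed squares on a finite point set such as a digital curve; this illustrates the object the result concerns, not the proof itself:

```python
def inscribed_squares(points):
    """Find all non-degenerate squares whose four corners lie in `points`.

    For each ordered pair of distinct points taken as one edge, rotating
    that edge by 90 degrees yields the other two corners; the square is
    inscribed iff both also belong to the point set.
    """
    pts = set(points)
    found = set()
    for (x1, y1) in pts:
        for (x2, y2) in pts:
            if (x1, y1) == (x2, y2):
                continue                      # zero edge => degenerate
            dx, dy = x2 - x1, y2 - y1
            p3, p4 = (x2 - dy, y2 + dx), (x1 - dy, y1 + dx)
            if p3 in pts and p4 in pts:
                found.add(frozenset([(x1, y1), (x2, y2), p3, p4]))
    return found
```

On the 8-point digital ring around a 2x2 cell, this finds both the axis-aligned square of corners and the 45-degree square through the edge midpoints.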
By
Frausto-Solís, Juan; González-Mendoza, Miguel; López-Díaz, Roberto
In this paper, the application of Wolfe's method to the learning stage of Support Vector Machines is presented. This stage is usually performed by solving a quadratic programming problem, and a common approach is to break that problem down into smaller subproblems that are easier to solve and manage. Instead of dividing the problem, we propose applying Wolfe's method, which transforms a quadratic programming problem into an equivalent linear model and uses a variation of the simplex method employed in linear programming. The proposed approach is compared against the quadprog Matlab function used to solve quadratic programming problems. Experimental results show that the proposed approach yields better classification quality than that function.
By
Calvo, Hiram; Inui, Kentaro; Matsumoto, Yuji
1 Citation
In this paper we present a comparison of two language models based on dependency triples. We explore using the verb only for predicting the most plausible argument, as in selectional preferences, as well as using both the verb and an argument for predicting another argument. The latter causes a data sparseness problem that must be addressed by different smoothing techniques. Based on our results with the K-Nearest Neighbor (KNN) algorithm, we conclude that adding more information is useful for attaining higher precision, while the PLSI model was inconveniently sensitive to this extra information, yielding better results with the simpler model (using the verb only). Our results suggest that combining the strengths of both algorithms would provide the best results.
