By
Faz-Hernández, Armando; Longa, Patrick; Sánchez, Ana H.
23 Citations
We propose efficient algorithms and formulas that improve the performance of side-channel protected scalar multiplication exploiting the Gallant-Lambert-Vanstone (CRYPTO 2001) and Galbraith-Lin-Scott (EUROCRYPT 2009) methods. Firstly, by adapting Feng et al.'s recoding to the GLV setting, we derive new regular algorithms for variable-base scalar multiplication that offer protection against simple side-channel and timing attacks. Secondly, we propose an efficient technique that interleaves ARM-based and NEON-based multiprecision operations over an extension field, as typically found on GLS curves and pairing computations, to improve performance on modern ARM processors. Finally, we showcase the efficiency of the proposed techniques by implementing a state-of-the-art GLV-GLS curve in twisted Edwards form defined over $\mathbb{F}_{p^2}$, which supports a four-dimensional decomposition of the scalar and runs in constant time, i.e., it is fully protected against timing attacks. For instance, using a precomputed table of only 512 bytes, we compute a variable-base scalar multiplication in 92,000 cycles on an Intel Ivy Bridge processor and in 244,000 cycles on an ARM Cortex-A15 processor. Our benchmark results and the proposed techniques contribute to the improvement of the state-of-the-art performance of elliptic curve computations. Most notably, our techniques allow us to reduce the cost of adding protection against timing attacks in the GLV-based variable-base scalar multiplication computation to below 10%.
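The regular-recoding idea behind such constant-time variable-base multiplication can be sketched in a few lines. This is a minimal illustration of a signed all-nonzero-digit recoding in the spirit of Feng et al.'s approach, not the authors' GLV implementation; the "group" here is just the integers under addition, and `ladder_mult` stands in for a real curve's point arithmetic.

```python
def regular_recode(k, n):
    """Recode an odd n-bit scalar into n signed digits in {-1, +1}.
    Every digit is nonzero, so evaluation performs the same fixed
    sequence of group operations for every scalar."""
    assert k & 1, "this recoding requires an odd scalar"
    digits = []
    for _ in range(n - 1):
        d = (k & 3) - 2          # d = -1 if k = 1 (mod 4), else +1
        digits.append(d)
        k = (k - d) >> 1
    digits.append(k)             # most-significant digit
    return digits

def ladder_mult(k, P, n, add, dbl, neg):
    """Evaluate k*P left to right: one double and one add per digit.
    A real implementation replaces the Python branch on the digit with
    a constant-time table lookup of +/-P."""
    digits = regular_recode(k, n)
    Q = P if digits[-1] == 1 else neg(P)
    for d in reversed(digits[:-1]):
        Q = add(dbl(Q), P if d == 1 else neg(P))
    return Q
```

Instantiated over the integers (`add`, `dbl`, `neg` as ordinary arithmetic), `ladder_mult(21, 5, 6, ...)` returns 105 = 21·5; on a curve the same skeleton drives the point operations.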
By
Adj, Gora; Menezes, Alfred; Oliveira, Thomaz; Rodríguez-Henríquez, Francisco
10 Citations
In 2013, Joux, and then Barbulescu, Gaudry, Joux and Thomé, presented new algorithms for computing discrete logarithms in finite fields of small and medium characteristic. We show that these new algorithms render the finite field ${\mathbb{F}}_{3^{6 \cdot 509}} = {\mathbb{F}}_{3^{3054}}$ weak for discrete logarithm cryptography in the sense that discrete logarithms in this field can be computed significantly faster than with the previous fastest algorithms. Our concrete analysis shows that the supersingular elliptic curve over ${\mathbb{F}}_{3^{509}}$ with embedding degree 6 that had been considered for implementing pairing-based cryptosystems at the 128-bit security level in fact provides only a significantly lower level of security. Our work provides a convenient framework and tools for performing a concrete analysis of the new discrete logarithm algorithms and their variants.
By
Oliveira, Thomaz; Aranha, Diego F.; López, Julio; Rodríguez-Henríquez, Francisco
7 Citations
In this paper we introduce new methods for computing constant-time variable-base point multiplications over the Galbraith-Lin-Scott (GLS) and the Koblitz families of elliptic curves. Using a left-to-right double-and-add and a right-to-left halve-and-add Montgomery ladder over a GLS curve, we present some of the fastest timings yet reported in the literature for point multiplication. In addition, we combine these two procedures to compute a multi-core protected scalar multiplication. Furthermore, we designed a novel regular $\tau$-adic scalar expansion for Koblitz curves. As a result, using the regular recoding approach, we set the speed record for a single-core constant-time point multiplication on standardized binary elliptic curves at the 128-bit security level.
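The left-to-right Montgomery ladder mentioned above can be sketched generically. This is the textbook ladder, not the paper's GLS-specific version; the demo group is again the integers under addition, with `identity` playing the role of the point at infinity.

```python
def montgomery_ladder(k, P, add, dbl, identity):
    """Left-to-right Montgomery ladder computing k*P.
    Every bit costs exactly one add and one double, and the invariant
    R1 = R0 + P holds throughout. A constant-time implementation
    replaces the branch below with a conditional swap on the key bit."""
    R0, R1 = identity, P
    for i in reversed(range(k.bit_length())):
        if (k >> i) & 1:
            R0, R1 = add(R0, R1), dbl(R1)
        else:
            R0, R1 = dbl(R0), add(R0, R1)
    return R0
```

The uniform add-plus-double per bit is what gives the ladder its resistance to simple side-channel analysis.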
By
Conde, Rodolfo; Rajsbaum, Sergio
3 Citations
In the consensus task each process proposes a value, and all correct processes have to decide the same value. In addition, validity requires that the decided value is a proposed value. Afek, Gafni and Lieber (DISC'09) introduced the safe-consensus task, by weakening the validity requirement: if the first process to invoke the task returns before any other process invokes it, then it outputs its input; otherwise, when there is concurrency, the consensus output can be arbitrary, not even the input of any process. Surprisingly, they showed that safe-consensus is equivalent to consensus, in a system where any number of processes can crash (e.g., wait-free).
We show that safe-consensus is nevertheless a much weaker communication primitive, in the sense that any wait-free implementation of consensus requires $\binom{n}{2}$ safe-consensus black boxes, and this bound is tight. The lower bound proof uses connectivity arguments based on subgraphs of Johnson graphs. For the upper bound protocol that we present, we introduce the g-2-coalitions-consensus task, which may be of independent interest. We work in an iterated model of computation, where the processes repeatedly: write their information to a (fresh) shared array, invoke safe-consensus boxes and snapshot the contents of the shared array.
By
Dani, Varsha; King, Valerie; Movahedi, Mahnush; Saia, Jared
6 Citations
We describe an asynchronous algorithm to solve secure multiparty computation (MPC) over n players, when strictly less than a $\frac{1}{8}$ fraction of the players are controlled by a static adversary. For any function f over a field that can be computed by a circuit with m gates, our algorithm requires each player to send a number of field elements and perform an amount of computation that is $\tilde{O}(\frac{m}{n} + \sqrt{n})$. This significantly improves over traditional algorithms, which require each player to both send a number of messages and perform computation that is Ω(nm).
Additionally, we define the threshold counting problem and present a distributed algorithm to solve it in the asynchronous communication model. Our algorithm is load balanced, with computation, communication and latency complexity of O(log n), and may be of independent interest to other applications with a load balancing goal in mind.
By
Alba-Cabrera, Eduardo; Ibarra-Fiallo, Julio; Godoy-Calderon, Salvador; Cervantes-Alonso, Fernando
1 Citation
The last few years have seen an important increase in research publications dealing with external typical testor-finding algorithms, while internal ones have been almost forgotten or modified to behave as external on the basis of their alleged poor performance. In this research we present a new internal typical testor-finding algorithm, called YYC, that incrementally calculates typical testors for the currently analyzed set of basic matrix rows by searching for compatible sets. The experimentally measured performance of this algorithm stands out favorably in problems where other external algorithms show very low performance. Also, a comparative analysis of its efficiency is done against some external typical testor-finding algorithms published during the last few years.
By
Graff, Mario; Graff-Guerrero, Ariel; Cerda-Jacobo, Jaime
5 Citations
There is great interest in the development of semantic genetic operators to improve the performance of genetic programming. Semantic genetic operators have traditionally been developed using either experimentally or theoretically based approaches. Our current work proposes a novel semantic crossover developed between the two traditional approaches. Our proposed semantic crossover operator is based on the derivative of the error propagated through the tree; this process decides the crossing point of the second parent. The results show that our procedure improves the performance of genetic programming on rational symbolic regression problems.
By
Herlihy, Maurice; Rajsbaum, Sergio; Raynal, Michel; Stainer, Julien
4 Citations
In a wait-free model any number of processes may crash. A process runs solo when it computes its local output without receiving any information from other processes, either because they crashed or they are too slow. While in wait-free shared-memory models at most one process may run solo in an execution, any number of processes may have to run solo in an asynchronous wait-free message-passing model.
This paper is on the computability power of models in which several processes may concurrently run solo. It first introduces a family of round-based wait-free models, called the d-solo models, 1 ≤ d ≤ n, where up to d processes may run solo. The paper then gives a characterization of the colorless tasks that can be solved in each d-solo model. It also introduces the (d,ε)-solo approximate agreement task, which generalizes ε-approximate agreement, and proves that (d,ε)-solo approximate agreement can be solved in the d-solo model, but cannot be solved in the (d + 1)-solo model. The paper also studies the relation linking d-set agreement and (d,ε)-solo approximate agreement in asynchronous wait-free message-passing systems.
These results establish for the first time a hierarchy of wait-free models that, while weaker than the basic read/write model, are nevertheless strong enough to solve non-trivial tasks.
By
García-Valdez, Mario; Guervós, Juan Julián Merelo; Fernández de Vega, Francisco
In this paper the effect of node unavailability on algorithms using EvoSpace, a pool-based evolutionary algorithm, is assessed. EvoSpace is a framework for developing evolutionary algorithms (EAs) using heterogeneous and unreliable resources. It is based on Linda's tuple-space coordination model. The core elements of EvoSpace are a central repository for the evolving population and remote clients, here called EvoWorkers, which pull random samples of the population and perform on them the basic evolutionary processes (selection, variation and survival); once the work is done, the modified sample is pushed back to the central population. To address the problem of unreliable EvoWorkers, EvoSpace uses a simple reinsertion algorithm based on copies of samples stored in a global queue, which also prevents the starvation of the population pool. Using a benchmark problem from the P-Peaks problem generator we compared two approaches: (i) the reinsertion of previous individuals, at the cost of keeping copies of each sample, and (ii) a common approach of other pool-based EAs, inserting randomly generated individuals. We found that EvoSpace is fault tolerant to highly unreliable resources and that the reinsertion algorithm is only needed when the population is near the point of starvation.
By
Ramírez-Corona, Mallinali; Sucar, L. Enrique; Morales, Eduardo F.
1 Citation
Hierarchical Multi-label Classification (HMC) is the task of assigning a set of classes to a single instance with the peculiarity that the classes are ordered in a predefined structure. We propose a novel HMC method for tree and Directed Acyclic Graph (DAG) hierarchies. Using the combined predictions of local classifiers and a weighting scheme according to the level in the hierarchy, we select the “best” single path for tree hierarchies, and multiple paths for DAG hierarchies. We developed a method that returns paths from the root down to a leaf node (Mandatory Leaf Node Prediction, or MLNP) and an extension for Non-Mandatory Leaf Node Prediction (NMLNP). For NMLNP we compared several pruning approaches, varying the pruning direction, pruning time and pruning condition. Additionally, we propose a new evaluation metric for hierarchical classifiers that avoids the bias of current measures, which favor conservative approaches when using NMLNP. The proposed approach was experimentally evaluated with 10 tree and 8 DAG hierarchical datasets in the domain of protein function prediction. We conclude that our method works better for deep DAG hierarchies and that, in general, NMLNP improves on MLNP.
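One possible reading of the “best single path” selection for a tree hierarchy can be sketched as follows. The local-classifier scores and the geometric level weighting below are hypothetical illustrations; the paper's actual weighting scheme may differ.

```python
def best_path(tree, scores, node="root", depth=0, level_weight=0.8):
    """Pick the root-to-leaf path maximizing the sum of local-classifier
    scores, each weighted by level_weight ** depth (deeper = less weight).
    `tree` maps a node to its children; `scores` maps a node to a score."""
    children = tree.get(node, [])
    if not children:                      # leaf: path ends here
        return [node], 0.0
    best = None
    for c in children:
        path, s = best_path(tree, scores, c, depth + 1, level_weight)
        s += (level_weight ** depth) * scores[c]
        if best is None or s > best[1]:
            best = ([node] + path, s)
    return best
```

On a small tree this greedy-free exhaustive scan returns the highest-scoring full path; an NMLNP variant would additionally allow stopping at an internal node when the remaining scores fall below a pruning threshold.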
By
Aguilar, José Alfonso; Zaldívar, Anibal; Tripp, Carolina; Misra, Sanjay; Sánchez, Salvador; Martínez, Miguel; García, Omar Vicente
1 Citation
In Web Engineering (WE), several Goal-Oriented Requirements Engineering (GORE) approaches have emerged that exploit its advantages, such as the representation of actors, their intentions, goals and the tasks needed to achieve each goal, for requirements specification, with promising results. Regrettably, the use of GORE approaches suffers from one detected gap, among others: scalability. In these modeling frameworks, when the designer performs the requirements specification, the requirements diagram (model) tends to grow rapidly, becoming very difficult to use in projects with a considerable number of requirements changing and growing constantly. In this paper, we propose an association form for the i* goal-oriented modeling framework in order to define the creation of two types of modules, Navigational and Service modules, since these are the two types of functional requirements most used for requirements specification in our proposal. Furthermore, we provide an example of application. With this approach, the benefits are twofold: firstly, the scalability of the Web requirements model is increased, so the model becomes less complex and easier to understand and maintain; secondly, it supports the construction of modeling tools that improve the user experience, the maintainability of the models and their reuse.
By
Fraigniaud, Pierre; Gafni, Eli; Rajsbaum, Sergio; Roy, Matthieu
2 Citations
The state machine approach is a well-known technique for building distributed services requiring high performance and high availability, by replicating servers and coordinating client interactions with server replicas using consensus. Indulgent consensus algorithms exist for realistic eventually partially synchronous models; they never violate safety and guarantee liveness once the system becomes synchronous. Unavoidably, these algorithms may never terminate, even when no processor crashes, if the system never becomes synchronous.
This paper proposes a mechanism similar to state machine replication, called RC-simulation, that can always make progress, even if the system is never synchronous. Using RC-simulation, the quality of the service adjusts to the current level of asynchrony of the network: degrading when the system is very asynchronous, and improving when the system becomes more synchronous. RC-simulation generalizes the state machine approach in the following sense: when the system is asynchronous, the system behaves as if k + 1 threads were running concurrently, where k is a function of the asynchrony.
In order to illustrate how RC-simulation can be used, we describe a long-lived renaming implementation. By reducing the concurrency down to the asynchrony of the system, RC-simulation makes it possible to obtain renaming quality that adapts linearly to the asynchrony.
By
Ritter, Gerhard X.; Urcid, Gonzalo
This paper presents an overview of the current status of lattice-based dendritic computing. Roughly speaking, lattice-based dendritic computing refers to a biomimetic approach to artificial neural networks whose computational aspects are based on lattice group operations. We begin our presentation by discussing some important processes of biological neurons, followed by a biomimetic model which implements these processes. We discuss the reasons and rationale behind this approach and illustrate the methodology with some examples. Global activities in this field as well as some potential research issues are also part of this discussion.
By
Fraigniaud, Pierre; Rajsbaum, Sergio; Roy, Matthieu; Travers, Corentin
1 Citation
This paper carries on the effort of bridging runtime verification with distributed computability, studying necessary conditions for monitoring failure-prone asynchronous distributed systems. It has recently been proved that there are correctness properties that require a large number of opinions to be monitored, an opinion being of the form true, false, perhaps, probably true, probably not, etc. The main outcome of this paper is to show that this large number of opinions is not an artifact induced by the existence of artificial constructions. Instead, monitoring an important class of properties, requiring processes to produce at most k different values, does require such a large number of opinions. Specifically, our main result is a proof that it is impossible to monitor k-set-agreement in an n-process system with fewer than min{2k, n} + 1 opinions. We also provide an algorithm to monitor k-set-agreement with min{2k, n} + 1 opinions, showing that the lower bound is tight.
By
Batyrshin, Ildar; Solovyev, Valery
2 Citations
The paper introduces new time series shape association measures based on Euclidean distance. A method for analyzing associations between time series, based on the separate analysis of positively and negatively associated local trends, is discussed. Examples of applying the proposed measures and methods to the analysis of associations between historical prices of securities obtained from Google Finance are considered. An example of time series with inverse associations between them is also discussed.
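The idea of comparing series by their local trends can be illustrated with a toy measure. This sketch only compares trend signs and is purely illustrative; the paper's measures are Euclidean-distance based and distinguish positively from negatively associated trends in a more refined way.

```python
def sign(v):
    """Sign of a difference: +1 rising, -1 falling, 0 flat."""
    return (v > 0) - (v < 0)

def local_trends(x):
    """Trend sign of each consecutive pair of the series."""
    return [sign(b - a) for a, b in zip(x, x[1:])]

def trend_association(x, y):
    """Fraction of matching local-trend signs, rescaled to [-1, 1]:
    +1 means the series always move together, -1 always oppositely."""
    tx, ty = local_trends(x), local_trends(y)
    matches = sum(a == b for a, b in zip(tx, ty))
    return 2 * matches / len(tx) - 1
```

Two series that rise and fall together score 1.0; a series and its mirror image score -1.0, the "inverse association" case the abstract mentions.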
By
Fraigniaud, Pierre; Rajsbaum, Sergio; Travers, Corentin
11 Citations
Decentralized runtime monitoring involves a set of monitors observing the behavior of system executions with respect to some correctness property. It is generally assumed that, as soon as a violation of the property is revealed by any of the monitors at runtime, some recovery code can be executed to bring the system back to a legal state. This implicitly assumes that each monitor produces a binary opinion, true or false, and that the recovery code is launched as soon as one of these opinions is equal to false. In this paper, we formally prove that, in a failure-prone asynchronous computing model, there are correctness properties for which there is no such decentralized monitoring. We show that there exist some properties which, in order to be monitored in a wait-free decentralized manner, inherently require that the monitors produce a number of opinions larger than two. More specifically, our main result is that, for every k, 1 ≤ k ≤ n, there exists a property that requires at least k opinions to be monitored by n monitors. We also present a corresponding distributed monitor using at most k + 1 opinions, showing that our lower bound is nearly tight.
By
Caballero, Leydi; Moreno, Ana M.; Seffah, Ahmed
4 Citations
Human centricity refers to the active involvement in the overall product lifecycle of different human actors, including end-users, stakeholders and providers. Persona is one of the different tools that exist for human centricity. While marketing is the domain in which persona was originally introduced, the technique has also been widely used in user-centered design (UCD). In both fields, persona has demonstrated its potential as an efficient tool for grouping users or customers and focusing on user or customer needs, goals and behavior. A segmentation technique is generally used with persona in order to group individual users according to their common features, identifying within these groups those that represent a pattern of human behavior. This paper investigates how persona has been used to improve usability in the agile development domain, while studying which contributions from marketing and HCI have enriched persona in this agile context.
By
Garcia-Ceja, Enrique; Osmani, Venet; Maxhuni, Alban; Mayora, Oscar
1 Citation
Social interactions play an important role in overall well-being. The current practice of monitoring social interactions through questionnaires and surveys is inadequate due to recall bias, memory dependence and high end-user effort. However, the sensing capabilities of smartphones can play a significant role in the automatic detection of social interactions. In this paper, we describe our method for detecting interactions between people, specifically focusing on interactions that occur in synchrony, such as walking. Walking together is an important aspect of social activity and can thus be used to provide better insight into social interaction patterns. For this work, we rely on sampling smartphone accelerometer and WiFi sensors only. We analyse WiFi and accelerometer data separately and combine them to detect walking in synchrony. The results show that, over seven days of monitoring seven subjects in a real-life setting, we achieve 99% accuracy, 77.2% precision and 90.2% recall detection rates when combining both modalities.
By
Kuri-Morales, Angel; Cortés-Arce, Iván
Computer networks are usually balanced by appealing to personal experience and heuristics, without taking advantage of the behavioral patterns embedded in their operation. In this work we report the application of computational intelligence tools to find such patterns and take advantage of them to improve the network's performance. The traditional traffic flow of a computer network is improved by the concatenated use of the following “tools”: a) applying intelligent agents, b) forecasting the traffic flow of the network via Multi-Layer Perceptrons (MLP), and c) optimizing the forecasted network's parameters with a genetic algorithm. We discuss the implementation and experimentally show that every consecutive new tool introduced improves the behavior of the network. This incremental improvement can be explained from the characterization of the network's dynamics as a set of emerging patterns in time.
By
Rubio, Juan Pablo Serrano; Aguirre, Arturo Hernández; Guzmán, Rafael Herrera
1 Citation
This paper introduces an Evolutionary Algorithm in Conformal Space (EACS) for global continuous optimization and its implementation using Conformal Geometric Algebra (CGA). Two new geometric search operators are included in the design of the EACS: the Inversion Search Operator (ISO) and the Reflection Search Operator (RSO). The ISO computes the inverse points with respect to hyperspheres, and the RSO redistributes the individuals on the surface of the hypersphere. The nonlinear geometric nature of the ISO enhances the search capability of the algorithm. The reproduction operators are described in the framework of CGA, which provides a concise way to perform rigid Euclidean transformations (rotations, translations, reflections) and inversions on hyperspheres; these transformations are easily computed using the products of the CGA. The performance of the EACS is analyzed on a benchmark of 28 functions. Statistical tests show the competitive performance of EACS in comparison with current leading algorithms (PSO and DE).
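The classical point inversion that the ISO builds on is easy to state in plain vector form, without the CGA machinery the paper uses:

```python
def sphere_inversion(x, center, r):
    """Inverse of point x with respect to the hypersphere (center, r):
        x' = c + r^2 * (x - c) / ||x - c||^2.
    Points inside the sphere map outside and vice versa, and
    |x' - c| * |x - c| == r^2 always holds."""
    v = [xi - ci for xi, ci in zip(x, center)]
    n2 = sum(vi * vi for vi in v)          # squared distance to the center
    scale = r * r / n2
    return [ci + scale * vi for ci, vi in zip(center, v)]
```

This nonlinearity is what lets an inversion-based search operator turn small steps near the sphere into large exploratory jumps, and conversely.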
By
Coello Coello, Carlos A.
In this paper, we provide a general introduction to the so-called multi-objective evolutionary algorithms, which are metaheuristic search techniques inspired by natural evolution that are able to deal with highly complex optimization problems having two or more objectives. In the first part of the paper, we provide some basic concepts necessary to make the paper self-contained, as well as a short review of the most representative multi-objective evolutionary algorithms currently available in the specialized literature. After that, a short review of applications of these algorithms in pattern recognition is provided. The final part of the paper presents some possible future research paths in this area as well as our conclusions.
By
Villalobos Castaldi, Fabiola M.; Felipe-Riveron, Edgardo M.; Gómez, Ernesto Suaste
The retinal vascular network has many desirable characteristics as a basis for authentication, including uniqueness, stability, and permanence. In this paper, a new approach to retinal image feature extraction and template coding is proposed. The use of a logarithmic spiral sampling grid for scanning and tracking the vascular network is the key to making this new approach simple, flexible and reliable. Experiments show that this approach reduces both the dimensionality of the data and the time required to obtain the biometric code of the vascular network in a retinal image. The performed experiments demonstrated that the proposed verification system has an average accuracy of 95.0–98%.
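A logarithmic spiral sampling grid of the kind described follows directly from the polar equation r = a·e^(bθ). The parameter values below are illustrative, not the paper's:

```python
import math

def log_spiral_points(cx, cy, a=1.0, b=0.15, n=64, turns=3.0):
    """Sample n points along a logarithmic spiral r = a * exp(b * theta)
    centered at (cx, cy), covering `turns` full revolutions.
    Scanning an image along such points gives a sampling grid that is
    dense near the center (e.g. the optic disc) and sparser outward."""
    pts = []
    for i in range(n):
        theta = 2 * math.pi * turns * i / n
        r = a * math.exp(b * theta)
        pts.append((cx + r * math.cos(theta), cy + r * math.sin(theta)))
    return pts
```

Reading pixel values at these coordinates yields a one-dimensional signature of the vascular network, which is one way such a grid can reduce the dimensionality of the template.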
By
García-Valdez, Mario; Trujillo, Leonardo; Merelo-Guérvos, Juan Julián; Fernández-de-Vega, Francisco
5 Citations
Recently, several Pool-based Evolutionary Algorithms (PEAs) have been proposed that asynchronously distribute an evolutionary search among heterogeneous devices, using controlled nodes and nodes outside the local network, through web browsers or cloud services. In PEAs, the population is stored in a shared pool, while distributed processes called workers execute the actual evolutionary search. This approach allows researchers to use low-cost computational power that might not be available otherwise. On the other hand, it introduces the challenge of leveraging the computing power of heterogeneous and unreliable resources. The heterogeneity of the system suggests that a heterogeneous parametrization might be a better option, so the goal of this work is to test such a scheme. In particular, this paper evaluates the strategy proposed by Gong and Fukunaga for the Island Model, which assigns random control parameter values to each worker. Experiments were conducted to assess the viability of this strategy on pool-based EAs using benchmark problems and the EvoSpace framework. The results suggest that the approach can yield results which are competitive with other parametrization approaches, without requiring any form of experimental tuning.
By
Menchaca-Mendez, Adriana; Coello Coello, Carlos A.
In this paper, we propose an approach that combines a modified version of the maximin fitness function and the hypervolume indicator for selecting individuals in a Multi-Objective Evolutionary Algorithm (MOEA). Our proposed selection mechanism is incorporated into a MOEA which adopts the crossover and mutation operators of the Non-dominated Sorting Genetic Algorithm-II (NSGA-II), giving rise to the so-called “Maximin-Hypervolume Multi-Objective Evolutionary Algorithm” (MH-MOEA). Our proposed MH-MOEA is validated using standard test problems taken from the specialized literature, using from three to six objectives. Our results are compared with respect to those produced by MC-MOEA (which is based on the maximin fitness function and a clustering technique), MOEA/D using Penalty Boundary Intersection (PBI), which is based on decomposition, and iSMS-EMOA (which is based on the hypervolume indicator). Our preliminary results indicate that our proposed MH-MOEA is a good alternative for solving multi-objective optimization problems of both low and high dimensionality in objective function space.
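The unmodified maximin fitness function underlying the proposal can be sketched in a few lines (Balling's formulation, minimization assumed); a negative fitness identifies a non-dominated individual:

```python
def maximin_fitness(objs):
    """Maximin fitness for a population of objective vectors:
        fitness(i) = max over j != i of min over k of (f_k(i) - f_k(j)).
    fitness(i) < 0  <=>  no other individual weakly dominates i,
    so smaller (more negative) values are better under minimization."""
    n = len(objs)
    fit = []
    for i in range(n):
        fit.append(max(
            min(objs[i][k] - objs[j][k] for k in range(len(objs[i])))
            for j in range(n) if j != i))
    return fit
```

For the front {(1,4), (2,2), (4,1)} plus the dominated point (3,3), the first three individuals get negative fitness while (3,3), dominated by (2,2), gets a positive one.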
By
Serrano, Juan Pablo; Hernández, Arturo; Herrera, Rafael
1 Citation
We apply the Hyper-Conic Artificial Multilayer Perceptron (HCMLP) to color image segmentation, where we consider image segmentation as a classification problem distinguishing between foreground and background pixels. The HCMLP was designed using the conic space and conformal geometric algebra. The neurons in the hidden layer contain a transfer function that defines a quadratic surface (spheres, ellipsoids, paraboloids and hyperboloids) by means of inner and outer products, and the neurons in the output layer contain a transfer function that decides whether a point is inside or outside a sphere. The Particle Swarm Optimization (PSO) algorithm is used to train the HCMLP. A benchmark of fifty images is used to evaluate the performance of the algorithm and to compare our proposal against statistical methods that use Gaussian copula functions.
By
Sharmeen, Zahra; Martinez-Enriquez, Ana Maria; Aslam, Muhammad; Syed, Afraz Zahra; Waheed, Talha
2 Citations
Natural disasters cause devastation in society due to their unpredictable nature. Whether the damage is minor or severe, emergency support should be provided in time. Multi-agent systems have been proposed to cope efficiently with emergency situations. A lot of work has been done on maturing the core functionality of these systems, but little attention has been given to their user interface. The world is moving towards an era where humans and machines work together to complete complex tasks. The management of such emergent situations is improved by combining superior human intelligence with the efficiency of multi-agent systems. Our goal is to design and develop an agent-based interface that facilitates humans not only in operating the system but also in mobilizing resources such as ambulances, fire brigades, etc., to reduce the loss of life and property. This enhancement improves system adaptability and speeds up the relief operation by saving the time human agents spend dealing with a complex computer interface.
By
Ledeneva, Yulia; García-Hernández, René Arnulfo; Gelbukh, Alexander
1 Citation
We suggest a new method for the task of extractive text summarization using graph-based ranking algorithms. The main idea of this paper is to rank Maximal Frequent Sequences (MFS) in order to identify the most important information in a text. MFS are considered as nodes of a graph in the term selection step, and are then ranked in the term weighting step using a graph-based algorithm. We show that the proposed method produces results superior to the state-of-the-art methods; in addition, the best sentences were found with this method. We prove that MFS are better than other terms. Moreover, we show that the longer the MFS, the better the results. If the stopwords are excluded, we lose the sense of MFS, and the results are worse. Another important aspect of this method is that it does not require deep linguistic knowledge, nor domain- or language-specific annotated corpora, which makes it highly portable to other domains, genres, and languages.
By
Menchaca-Mendez, Adriana; Montero, Elizabeth; Riff, María-Cristina; Coello Coello, Carlos A.
2 Citations
In this paper, we study iSMS-EMOA, a recently proposed approach that improves the well-known S-metric selection Evolutionary Multi-Objective Algorithm (SMS-EMOA). These two indicator-based multi-objective evolutionary algorithms rely on hypervolume contributions to select individuals. Here, we propose to define a probability of using a randomly selected individual within the iSMS-EMOA selection scheme. In order to calibrate the value of this probability, we use the EVOCA tuner. Our preliminary results indicate that we are able to save up to 33% of the computations of the contribution to the hypervolume with respect to the original iSMS-EMOA, without any significant quality degradation in the solutions obtained. In fact, in some cases, the approach proposed here was even able to improve the quality of the solutions obtained by the original iSMS-EMOA.
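In two dimensions the hypervolume contribution both algorithms rely on reduces to one rectangle per point of the sorted front. A minimal sketch, assuming a minimization problem and a non-dominated front:

```python
def hv_contributions_2d(front, ref):
    """Exclusive hypervolume contribution of each point of a 2-D
    non-dominated front (minimization) w.r.t. reference point `ref`.
    Sorting by the first objective makes the second one descend, so
    each point's exclusive region is the rectangle between it, its
    right neighbor's x, and its left neighbor's y."""
    pts = sorted(front)
    contrib = {}
    for i, (x, y) in enumerate(pts):
        right_x = pts[i + 1][0] if i + 1 < len(pts) else ref[0]
        upper_y = pts[i - 1][1] if i > 0 else ref[1]
        contrib[(x, y)] = (right_x - x) * (upper_y - y)
    return contrib
```

In SMS-EMOA-style selection, the point with the smallest such contribution is the one discarded; computing only the contributions that can change is where the savings discussed above come from.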
By
Lazo-Cortés, Manuel S.; Martínez-Trinidad, José Fco.; Carrasco-Ochoa, J. A.
This paper studies the relationship between the two most common definitions of reduct. Although there are other definitions, almost all the literature published in the framework of the Theory of Rough Sets uses one of the two definitions we study here. However, there is ambiguity in the use of these definitions, and authors often do not declare beforehand which definition they refer to. Moreover, there are no publications where the relation between these two definitions is discussed in depth; that is precisely what this paper addresses. We state and prove several properties expressing the relations between both definitions, including some illustrative examples.
By
Figueroa, Karina; Paredes, Rodrigo
3 Citations
The permutation-based index has been shown to be very effective in medium- and high-dimensional metric spaces, even in difficult problems such as solving reverse k-nearest neighbor queries. Nevertheless, there is currently no study of which features are desirable in a permutant set, or of how to select good permutants. As in the case of pivots, our experimental results show that, compared with a randomly chosen set, a good permutant set yields faster query responses or reduces the amount of space used by the index. In this paper, we start by characterizing permutants and studying their predictive power; we then propose an effective heuristic to select a good set of permutant candidates. We also show empirical evidence that supports our technique.
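The core of a permutation-based index is to represent each object by the order in which it "sees" the permutants, and to compare those orders, for example with the Spearman footrule:

```python
def permutation_of(obj, permutants, dist):
    """Indices of the permutants ordered by increasing distance to obj."""
    return sorted(range(len(permutants)), key=lambda i: dist(obj, permutants[i]))

def footrule(pi_a, pi_b):
    """Spearman footrule between two permutations: total displacement of
    each permutant's rank. A small value suggests the two objects are
    close, which is what the index exploits to prune candidates."""
    pos_b = {p: i for i, p in enumerate(pi_b)}
    return sum(abs(i - pos_b[p]) for i, p in enumerate(pi_a))
```

In a 1-D toy space with permutants {0, 10, 20}, objects 1 and 2 induce the same permutation (footrule 0), while object 19 induces the reversed one, so the index would rank it as a far candidate.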
more …
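The permutation representation itself can be sketched in a few lines: each object is described by the order in which it sees the permutants, and two objects are compared by how much those orderings disagree (Spearman footrule). The toy 1-D metric space below is an illustrative assumption, not the paper's experimental setup:

```python
def permutation(obj, permutants, dist):
    """Identifiers of the permutants, ordered by closeness to obj."""
    return sorted(range(len(permutants)), key=lambda i: dist(obj, permutants[i]))

def footrule(pi, sigma):
    """Spearman footrule: total displacement between two orderings."""
    pos = {p: i for i, p in enumerate(sigma)}
    return sum(abs(i - pos[p]) for i, p in enumerate(pi))

# toy 1-D metric space: objects are numbers, distance is |a - b|
d = lambda a, b: abs(a - b)
perms = [0, 10]  # two (hypothetical) permutants
```

Objects close to each other in the metric space tend to produce similar permutations, which is what the index exploits; the paper's contribution is choosing which permutants to use.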
By
Aroche-Villarruel, Argenis A.; Carrasco-Ochoa, J. A.; Martínez-Trinidad, José Fco.; Olvera-López, J. Arturo; Pérez-Suárez, Airel
Show all (5)
1 Citation
In this paper we present a study of the overlapping clustering algorithms OKM, WOKM and OKMED, which are extensions to the overlapping case of the well-known K-means algorithm proposed for building partitions. Unlike other previously reported comparisons, in our study we compare these algorithms using the external evaluation metric FBcubed, which takes into account the overlap among clusters, and we contrast our results against those obtained with F-measure, a metric that does not take this overlap into account and that has been used in a previously reported comparison.
By
Olguin-Carbajal, Mauricio; Herrera-Lozada, J. Carlos; Arellano-Verdejo, Javier; Barron-Fernandez, Ricardo; Taud, Hind
Show all (5)
This paper presents an empirical study of the performance of a micro Differential Evolution algorithm (micro-DE) versus that of a canonical Differential Evolution (DE) algorithm. Micro-DE is a DE algorithm with a reduced population and some other differences. This paper's objective is to show that our micro-DE outperforms the canonical DE for large-scale optimization problems, using a test bed consisting of 20 complex functions with high dimensionality for a performance comparison between the algorithms. The results show two important points: first, the relevance of an accurate setting of the optimization algorithm's parameters with regard to the problem itself; second, the superior performance of our micro-DE with respect to DE in 19 out of 20 tested functions. In some functions, the difference is up to seven orders of magnitude. We also show that micro-DE is statistically better than a simple DE and an adjusted DE for high dimensionality. In several problems where DE is used, micro-DE is highly recommended, as it achieves better results and statistical behavior without much change in code.
By
Cortés-Berrueco, Luis Enrique; Gershenson, Carlos; Stephens, Christopher R.
In this paper, three computational models for the study of the evolution of cooperation under cultural propagation are studied: Kin Selection, Direct Reciprocity and Indirect Reciprocity. Two analyses are reported: one comparing the models' behavior among themselves, and a second one identifying the impact that different parameters have on the model dynamics. The results of these analyses illustrate how game transitions may occur depending on some parameters within the models, and also explain how agents adapt to these transitions by individually choosing their attachment to a cooperative attitude. These parameters regulate how cooperation can self-organize under different circumstances. The emergence of the evolution of cooperation as a result of the agents' adaptation processes is also discussed.
By
Bravo, Maricela; Rodríguez, José; Reyes, Alejandro
1 Citation
Service Oriented Computing (SOC) has incrementally been adopted as the preferred programming paradigm for the development, integration and interoperation of large and complex information systems. However, despite its increasing popularity, SOC has not yet achieved its full potential. This is mainly due to the lack of supporting tools to semantically enrich and represent Web service descriptions. This paper describes a solution approach for the automatic representation of Web service descriptions and their further semantic enrichment, relating operation names based on the calculation of four semantic similarity measures. The enrichment approach is accurate because the final decision is made through a voting scheme; in the case of inconsistent results, these are not asserted into the ontology. Experimentation shows that although few similarity relationships are found and asserted, they represent an important step towards the automatic discovery of information that was previously unknown.
By
Lopez, Alejandro; Cienfuegos, Miguel; Ainseba, Bedreddine; Bendahmane, Mostafa
Show all (4)
1 Citation
In this paper we present a nearest-neighbor particle swarm optimization (PSO) algorithm applied to the numerical analysis of the inverse problem in electrocardiography. A two-step algorithm is proposed, based on the application of the modified PSO algorithm with the Tikhonov regularization method to calculate the potential distribution in the heart. The PSO improvements include the use of neighborhood particles as a strategy to balance exploration and exploitation, in order to prevent premature convergence and produce a better local search. In the literature, the inverse problem in electrocardiography is solved using the minimum energy norm in a Tikhonov regularization scheme. Although this approach solves the system, the solution may not have a meaning in the physical sense. Compared to the classical reconstruction, the two-step PSO algorithm improves the accuracy of the solution with respect to the original distribution. Finally, to validate our results, we create a distribution over the heart by using a model of electrical activity (the bidomain model) coupled with a volume conductor model for the torso. Then, using our method, we reconstruct the potential distribution.
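The Tikhonov step of such a reconstruction amounts to a ridge-regularized least-squares solve, x = (AᵀA + λI)⁻¹Aᵀb. A minimal sketch for a two-parameter system, using Cramer's rule on the normal equations (the matrix A and the regularization weight λ are placeholders, not the paper's transfer model):

```python
def tikhonov_solve(A, b, lam):
    """Ridge solution x = (A^T A + lam*I)^{-1} A^T b for a 2-column A."""
    n = len(A)
    # normal-equation matrix M = A^T A + lam*I and right-hand side v = A^T b
    m00 = sum(A[i][0] * A[i][0] for i in range(n)) + lam
    m11 = sum(A[i][1] * A[i][1] for i in range(n)) + lam
    m01 = sum(A[i][0] * A[i][1] for i in range(n))
    v0 = sum(A[i][0] * b[i] for i in range(n))
    v1 = sum(A[i][1] * b[i] for i in range(n))
    det = m00 * m11 - m01 * m01  # positive whenever lam > 0
    return [(v0 * m11 - m01 * v1) / det, (m00 * v1 - m01 * v0) / det]
```

Increasing λ shrinks the solution toward zero (the minimum-norm bias the abstract refers to); the paper's PSO step then searches around such regularized solutions.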
By
Herzig, Andreas; Pozos-Parra, Pilar; Schwarzentruber, François
2 Citations
We study syntactical merging operations that are defined semantically by means of the Hamming distance between valuations; more precisely, we investigate the Σ-semantics, Gmax-semantics and max-semantics. We work with a logical language containing merging operators as connectives, as opposed to the metalanguage operations of the literature. We capture these merging operators as programs of Dynamic Logic of Propositional Assignments (DL-PA). This provides a syntactical characterisation of the three semantically defined merging operators, and a proof system for DL-PA therefore also provides a proof system for these merging operators. We explain how PSPACE membership of the model checking and satisfiability problems of star-free DL-PA can be extended to the variant of DL-PA where symbolic disjunctions parametrised by sets (which are not defined as abbreviations, but are proper connectives) are built into the language. As our merging operators can be polynomially embedded into this variant of DL-PA, we obtain that both the model checking and the satisfiability problem of a formula containing possibly nested merging operators are in PSPACE.
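The Σ-semantics can be sketched directly over valuations: each base contributes the minimal Hamming distance from a candidate valuation to its set of models, and the merging keeps the valuations minimizing the sum. This is a brute-force illustration of the semantics only, not the DL-PA encoding studied in the paper:

```python
from itertools import product

def hamming(v, w):
    """Number of propositional atoms on which two valuations differ."""
    return sum(v[p] != w[p] for p in v)

def sigma_merge(atoms, profile_models):
    """Sigma-semantics: valuations minimizing the SUM over the profile of
    the minimal Hamming distance to each base's set of models."""
    def d(v, models):
        return min(hamming(v, w) for w in models)
    all_vals = [dict(zip(atoms, bits))
                for bits in product([False, True], repeat=len(atoms))]
    scored = [(sum(d(v, m) for m in profile_models), v) for v in all_vals]
    best = min(s for s, _ in scored)
    return [v for s, v in scored if s == best]
```

With a profile of two bases for p ∧ q and one for ¬p ∧ ¬q, the majority wins: the only Σ-optimal valuation makes both p and q true.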
By
Domínguez, A. Rojas; Lara-Alvarez, Carlos; Bayro, Eduardo
2 Citations
A novel method for the automated identification of banknote denominations based on image processing is presented. The method is part of a wearable aiding system for the visually impaired, and it uses a standard video camera as the image collecting device. The method first extracts points of interest from the denomination region of a banknote and then performs an analysis of the geometrical patterns so defined, which allows the identification of the banknote denomination. Experiments were performed with a test subject in order to simulate real-world operating conditions. A high average identification rate was achieved.
By
Lara-Alvarez, Carlos; Rojas, Alfonso; Bayro-Corrochano, Eduardo
The Place Recognition (PR) problem is fundamental for real-time applications such as mobile robots (e.g. to detect loop closures) and guidance systems for the visually impaired. The Bag of Words (BoW) is a conventional approach that calculates a histogram of frequencies. One of the disadvantages of the BoW representation is that it loses information about the spatial location of features in the image. In this paper we study an approximate index based on the classic q-gram paradigm to retrieve images. Similar to the BoW, our approach detects interest points and assigns labels. Each image is represented by a set of q-grams obtained from triangles of a Delaunay decomposition. This representation allows us to create an index and to retrieve images efficiently. The proposed approach is path-independent and was tested with a publicly available dataset, showing a high recall rate and reduced time complexity.
By
Garcia-Ceja, Enrique; Brena, Ramon; Galván-Tejada, Carlos E.
1 Citation
Most of the previous work in hand gesture recognition focuses on increasing the accuracy and robustness of the systems; however, little has been done to understand the context in which the gestures are performed, i.e., the same gesture could mean different things depending on the context and situation. Understanding the context may help to build more user-friendly and interactive systems. In this work, we used location information in order to contextualize the gestures. The system constantly identifies the location of the user, so when he/she performs a gesture the system can perform an action based on this information.
By
Zapotecas Martínez, Saúl; Sosa Hernández, Víctor A.; Aguirre, Hernán; Tanaka, Kiyoshi; Coello Coello, Carlos A.
Show all (5)
7 Citations
The design of selection mechanisms based on quality assessment indicators has become one of the main research topics in the development of Multi-Objective Evolutionary Algorithms (MOEAs). Currently, most indicator-based MOEAs have employed the hypervolume indicator as their selection mechanism in the search process. However, hypervolume-based MOEAs become inefficient (and eventually, unaffordable) as the number of objectives increases. In this paper, we study the construction of a reference set from a family of curves. Such a reference set is used together with a performance indicator (namely Δ_p) to assess the quality of solutions in the evolutionary process of an MOEA. We show that our proposed approach is able to deal (in an efficient way) with problems having many objectives (up to ten objective functions). Our preliminary results indicate that our proposed approach is highly competitive with respect to two state-of-the-art MOEAs over the set of test problems adopted in our comparative study.
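The Δ_p indicator referenced above is the averaged Hausdorff distance: the maximum of the averaged generational distance (GD_p) and inverted generational distance (IGD_p) between an approximation set and the reference set. A direct sketch, with Euclidean distance and p = 2 assumed:

```python
def delta_p(approx, ref, p=2,
            dist=lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5):
    """Averaged Hausdorff distance Delta_p = max(GD_p, IGD_p)."""
    def d(point, S):  # distance from a point to the closest member of S
        return min(dist(point, s) for s in S)
    gd = (sum(d(a, ref) ** p for a in approx) / len(approx)) ** (1 / p)
    igd = (sum(d(r, approx) ** p for r in ref) / len(ref)) ** (1 / p)
    return max(gd, igd)
```

GD_p penalizes solutions far from the reference set (convergence), IGD_p penalizes regions of the reference set left uncovered (spread); taking the maximum makes Δ_p sensitive to both, which is why it scales to many objectives better than hypervolume.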
By
Rosales-Pérez, Alejandro; Gonzalez, Jesus A.; Coello Coello, Carlos A.; Reyes-Garcia, Carlos A.; Escalante, Hugo Jair
Show all (5)
2 Citations
This paper introduces EMOPG+FS, a novel approach to prototype generation and feature selection that explicitly minimizes the classification error rate, the number of prototypes, and the number of features. Under EMOPG+FS, prototypes are initialized from a subset of training instances, whose positions are adjusted through a multi-objective evolutionary algorithm. The optimization process aims to find a set of suitable solutions that represent the best possible trade-offs among the considered criteria. Besides this, we also propose a strategy for selecting a single solution from the several that are generated during the multi-objective optimization process. We assess the performance of our proposed EMOPG+FS using a suite of benchmark data sets, and we compare its results with those obtained by other evolutionary and non-evolutionary techniques. Our experimental results indicate that our proposed approach is able to achieve highly competitive results.
By
Arriaga, Julio G.; Sanchez, Hector; Hedley, Richard; Vallejo, Edgar E.; Taylor, Charles E.
Show all (5)
1 Citation
In this paper, we present a comparative study on the application of pattern recognition algorithms to the identification of bird individuals from their songs. A collection of experiments on the supervised classification of Cassin's Vireo individuals was conducted to identify the algorithm that produced the highest classification accuracy. Preliminary results indicated that Multinomial Naive Bayes produced excellent classification of bird individuals.
By
Rangel, Eduardo; Alanís, Alma Y.; Ricalde, Luis J.; Arana-Daniel, Nancy; López-Franco, Carlos
Show all (5)
2 Citations
This paper deals with a novel training algorithm for a neural network architecture applied to solar radiation time series prediction. The proposed training algorithm is based on a novel bio-inspired aging model particle swarm optimization (BAM-PSO). The BAM-PSO-based algorithm is employed to update the synaptic weights of the neural network. The size of the regression vector is determined by means of the Cao methodology. The proposed structure efficiently captures the complex nature of solar radiation time series. The proposed model is trained and tested using real data values for solar radiation.
By
Aizenberg, Igor; Sheremetov, Leonid; Villa-Vargas, Luis
In this paper, we discuss long-term time series forecasting using a Multilayer Neural Network with Multi-Valued Neurons (MLMVN). This is a complex-valued neural network with a derivative-free backpropagation learning algorithm. We evaluate the proposed approach using a real-world data set describing the dynamic behavior of an oilfield asset located in the coastal swamps of the Gulf of Mexico. We show that MLMVN can be efficiently applied to univariate and multivariate multi-step-ahead prediction of reservoir dynamics. This paper is not only intended to propose a novel forecasting model, but also to carefully study several aspects of the application of ANN models to time series forecasting that could be of interest to the pattern recognition community.
By
Altamirano-Gómez, Gerardo E.; Bayro-Corrochano, Eduardo
3 Citations
In this paper, we introduce a novel geometric voting scheme that extends previous algorithms, like the Hough transform and tensor voting, in order to tackle perceptual organization problems. Our approach is grounded in three methodologies: representation of information using Conformal Geometric Algebra; a local voting process, which introduces global perceptual considerations at a low level; and a global voting process, which clusters salient geometric entities in the whole image. Since geometric algebra is the mathematical framework of our approach, our algorithm infers high-level geometric representations from tokens that are perceptually salient in an image.
By
Febrer-Hernández, José Kadir; Hernández-Palancar, José; Hernández-León, Raudel; Feregrino-Uribe, Claudia
Show all (4)
In this paper, we propose a novel algorithm for mining frequent sequences, called SPaMi-FTS (Sequential Pattern Mining based on Frequent Two-Sequences). SPaMi-FTS introduces a new data structure to store the frequent sequences which, together with a new pruning strategy to reduce the number of candidate sequences and a new heuristic to generate them, increases the efficiency of frequent sequence mining. The experimental results show that the SPaMi-FTS algorithm performs better than the main algorithms reported for discovering frequent sequences.
By
Chi-Poot, Angel; Martin-Gonzalez, Anabel
Augmented reality (AR) is one of the latest technologies that have proven to be an efficient tool for improving pedagogical techniques. In this work, we present preliminary results of ongoing research on the development of an augmented reality system to facilitate the learning of Euclidean vector properties in physics. The system aids the user in understanding physical concepts, such as magnitude and direction, along with operations like addition, subtraction and cross product of vectors, by visualizing augmented virtual components merged in a user-interaction environment.
By
Casarrubias-Vargas, Heriberto; Petrilli-Barceló, Alberto; Bayro-Corrochano, Eduardo
1 Citation
The understanding of scenes is a key aspect of computer vision. Edge detection helps us understand more about the scene structure, since edges mark a clear transition from one region with similar properties to another. When the edges are obtained from changes in orientation, we can use them to find key planes and describe the scene. This paper describes a method for fast edge detection in RGB-D images. The edge detection algorithm for depth images is based on the use of smoothness constraints on orientation. We show experimental results that demonstrate the potential of the proposed approach for edge detection.
By
Morales-Xicohtencatl, Mildred; Flores-Pulido, Leticia; Sánchez-Pérez, Carolina Rocío; Córdova-Zamorano, Juan José
Show all (4)
Computation is applicable to any field in order to improve processing times and results; this article demonstrates an automatic process applied to the area of astronomy. The classification of electromagnetic spectra by pattern recognition is based on an ensemble composed of neighborhood-based classification methods. The electromagnetic spectra to classify are obtained from the SDSS-III (Sloan Digital Sky Survey); the classification process consists of a preprocessing step, to obtain a specific region of the spectrum, followed by the extraction of relevant features by means of digital signal processing and the Haar wavelet transform.
By
Gaxiola, Leopoldo N.; Díaz-Ramírez, Víctor Hugo; Tapia, Juan J.; Diaz-Ramirez, Arnoldo; Kober, Vitaly
Show all (5)
3 Citations
A face tracking algorithm based on locally adaptive correlation filtering is proposed. The algorithm is capable of tracking a face with invariance to pose, gesticulations, occlusions and clutter. The target face is chosen at the beginning of the algorithm. Afterwards, a composite filter is designed to recognize the face in posterior frames. The filter is adapted online using information from current and past scene frames. The adaptive filter is constructed by combining several optimal templates designed for distortion-invariant target recognition. Results obtained with the proposed algorithm using real-life scenes are presented and compared with those obtained with a recent state-of-the-art tracking method, in terms of detection efficiency, tracking accuracy, and speed of processing.
By
Domínguez, Ignacio Segovia; Aguirre, Arturo Hernández; Valdez, S. Ivvan
2 Citations
This paper introduces the Gradient-driven Density Function (∇_d D) approach and its application to Estimation of Distribution Algorithms (EDAs). In order to compute the ∇_d D, we also introduce the Expected Gradient Estimate (EGE), which is an estimation of the gradient based on information from other individuals. Whilst the EGE delivers an estimation of the gradient vector at the position of any individual, the ∇_d D delivers a statistical model (e.g. the normal distribution) that allows the sampling of new individuals around the direction of the estimated gradient. Hence, in the proposed EDA, the gradient information is inherited by the new population. The computation of the EGE vector does not need additional function evaluations. It is worth noting that this paper focuses on black-box optimization. The proposed EDA is tested with a benchmark of 10 problems. The statistical tests show a competitive performance of the proposal.
By
Alba, Alfonso; Arce-Santana, Edgar; Aguilar-Ponce, Ruth M.; Campos-Delgado, Daniel U.
Show all (4)
8 Citations
In computer vision and video encoding applications, one of the first and most important steps is to establish a pixel-to-pixel correspondence between two images of the same scene obtained at slightly different times or points of view. One of the most popular methods for finding these correspondences, known as Area Matching, consists in performing a computationally intensive search, for each pixel in the first image, around a neighborhood of the same pixel in the second image. In this work we propose a method which significantly reduces the search space to only a few candidates, and permits the implementation of real-time vision and video encoding algorithms which do not require specialized hardware such as GPUs or FPGAs. Theoretical and experimental support for this method is provided. Specifically, we present results from the application of the method to real-time video compression and transmission, as well as the real-time estimation of dense optical flow and stereo disparity maps, where a basic implementation achieves up to 100 fps on a typical dual-core PC.
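The Area Matching baseline whose search space the authors reduce can be sketched as a sum-of-absolute-differences (SAD) search along a scanline, as in stereo disparity estimation. The toy images, block size, and disparity range below are illustrative assumptions:

```python
def sad(block_a, block_b):
    """Sum of absolute differences between two equally sized blocks."""
    return sum(abs(x - y)
               for ra, rb in zip(block_a, block_b)
               for x, y in zip(ra, rb))

def best_match(left, right, row, col, block=3, max_disp=5):
    """Search along the same scanline for the disparity minimizing SAD."""
    h = block // 2
    ref = [r[col - h:col + h + 1] for r in left[row - h:row + h + 1]]
    best, best_d = float('inf'), 0
    for d in range(0, max_disp + 1):
        c = col - d
        if c - h < 0:
            break
        cand = [r[c - h:c + h + 1] for r in right[row - h:row + h + 1]]
        s = sad(ref, cand)
        if s < best:
            best, best_d = s, d
    return best_d
```

Every pixel pays `block² × max_disp` comparisons here, which is exactly the cost the paper's candidate-pruning method attacks.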
By
Ramírez-de-la-Rosa, Gabriela; Villatoro-Tello, Esaú; Jiménez-Salazar, Héctor; Sánchez-Sánchez, Christian
Show all (4)
5 Citations
Online communities are filled with comments from loyal readers or first-time viewers, who are constantly creating and sharing information at an unprecedented level, resulting in millions of messages containing the opinions, ideas, needs and beliefs of Internet users. Therefore, business companies are very interested in finding influential users and encouraging them to create positive influence. Influential users are users with the ability to influence individuals' attitudes in a desired way with relative frequency. We present an empirical analysis of the influential user identification problem in Twitter. Our proposed approach considers that the influence level of users can be detected by considering their communication patterns, by means of particular writing-style features as well as behavioral features. Experiments performed on more than 7,000 user profiles indicate that it is possible to automatically identify influential users among the members of a social networking community, and our approach obtains competitive results against several state-of-the-art methods.
By
Morán, Alberto L.; Meza, Victoria; Ramírez-Fernández, Cristina; Grimaldo, Ana I.; García-Canseco, Eloísa; Orihuela-Espina, Felipe; Sucar, Luis Enrique
Show all (7)
4 Citations
We report the results of an indirect-observation usability and user experience (UX) study on the use of the Gesture Therapy (GT) rehabilitation platform as a physical activation and cognitive stimulation tool for the elderly. The results from this study complement those of a former self-report study [8]. In both studies, elders perceived the system as having high usefulness, usability, and UX, as well as generating low anxiety. The results also allowed us to analyze and evaluate the impact of elders' previous computer use experience on specific aspects. Interestingly, the significance of the effect of previous computer use experience on the perceived anxiety and perceived enjoyment aspects of UX was different in the two studies, although there is an important overlap for ease-of-use factors. These results, although not yet conclusive on the causes of the difference, provide us with further evidence to establish that elders' previous experience (or not) with computer use affects their user experience with the GT platform.
By
Guzmán, Luis F.; Camarena-Ibarrola, Antonio
Audio Following (AF) is the process of mapping a musician's performance, usually in real time, to a base performance that serves as a reference. Such a base performance is considered a "correct performance", and thus the live performance must be aligned to it. The objective of AF is to track the musician's position throughout the performance. We present a novel approach to AF that uses a locality sensitive hashing (LSH) based index to perform this task. First, we obtain the Audio Fingerprint (AFP) of the base performance. Then, the obtained AFP is indexed using LSH. This performance's AFP is used as a reference to align any other performance of the same music. Next, we obtain half-a-second sub-AFPs of the performance being followed, and their corresponding positions in the reference AFP are searched for by querying the index. The system was tested on a set of 22 pianists playing music by Chopin, with very good results when comparing the obtained alignment with the ideal alignment.
By
Finn, Chelsea; Duyck, James; Hutcheon, Andy; Vera, Pablo; Salas, Joaquin; Ravela, Sai
Show all (6)
3 Citations
The characterization of individual animal life history is crucial for conservation efforts. In this paper, Sloop, an operational pattern retrieval engine for animal identification, is extended by coupling crowdsourcing with image retrieval. The coupled system delivers scalable performance by using aggregated computational inference to effectively deliver precision, and by using human feedback to efficiently improve recall. To the best of our knowledge, this is the first coupled human-machine animal biometrics system, and results on multiple species indicate that it is a promising approach for large-scale use.
By
Santoyo, Francisco; Chávez, Edgar; Téllez, Eric S.
Some instances of multimedia data can be represented as high-dimensional binary vectors under the Hamming distance. The standard index used to handle queries is Locality Sensitive Hashing (LSH), which reduces approximate queries to a set of exact searches. When the queries are not selective and multiple families of hashing functions are employed, or when the collection is large, LSH indexes must be stored in secondary memory, slowing down the query time.
In this paper we present a compressed LSH index, queryable without decompression and with negligible impact on query speed. This compressed representation enables larger collections to be handled in main memory, with the corresponding speedup with respect to fetching data from secondary memory.
We tested the index with a real-world example, indexing songs to detect near-duplicates. Songs are represented using an entropy-based audio fingerprint (AFP), of independent interest.
The combination of compressed LSH and the AFP enables the retrieval of lossy-compressed audio with near-perfect recall at bitrates as low as 32 kbps, packing the representation of 30+ million music tracks of standard length (about the total number of unique tracks of music available worldwide) into half a gigabyte of space. A sequential search for matches would take about 15 minutes, while using our compressed index, of size roughly one gigabyte, searching for a song takes a fraction of a second.
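The LSH machinery underlying both this index and the audio-following system above can be sketched with bit sampling for the Hamming distance: each table hashes a vector by a fixed random subset of its bits, so near-identical vectors collide in at least one table. Table counts and hash widths below are illustrative; the compression scheme that is the paper's contribution is not reproduced here:

```python
import random

def make_hashes(dim, n_tables=4, bits_per_hash=8, seed=7):
    """One random bit-position subset per hash table."""
    rng = random.Random(seed)
    return [rng.sample(range(dim), bits_per_hash) for _ in range(n_tables)]

def index(vectors, hashes):
    """Bucket each vector id by its sampled-bit key, once per table."""
    tables = [{} for _ in hashes]
    for vid, v in enumerate(vectors):
        for t, positions in enumerate(hashes):
            key = tuple(v[p] for p in positions)
            tables[t].setdefault(key, []).append(vid)
    return tables

def query(q, tables, hashes):
    """Union of the buckets q falls into; candidates for exact checking."""
    cand = set()
    for t, positions in enumerate(hashes):
        cand.update(tables[t].get(tuple(q[p] for p in positions), []))
    return cand
```

A query returns only the candidates sharing a bucket with it, which is the "set of exact searches" the abstract refers to; vectors far away in Hamming distance almost never collide.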
By
Urzaiz, Gabriel; Hervas, Ramón; Fontecha, Jesús; Bravo, José
Show all (4)
2 Citations
This short paper presents a high-level model aimed at obtaining uniform functionality and performance of the services provided within a healthy smart city. The model is based on a three-layer architecture and uses an overlay network to provide enhanced network and semantic functionality. This is a novel approach that incorporates the use of health-related data coming from different information sources to provide smart health services across the city.
By
Camacho-Bello, C.; Báez-Rojas, J. J.
1 Citation
We present a new approach for angle estimation in binary images based on Hahn moments, which provides an approximate estimate with short computation times. The proposed method retrieves the angle formed from one reference point to another through a multiple linear regression and a set of Hahn moments obtained from a training database. Finally, we discuss the performance analysis of our approach under noise conditions and scale changes.
By
Terven, Juan R.; Salas, Joaquin; Raducanu, Bogdan
3 Citations
This paper presents a system capable of recognizing six head gestures: nodding, shaking, turning right, turning left, looking up, and looking down. The main difference of our system compared to other methods is that the Hidden Markov Models presented in this paper are fully connected and consider all possible states in any given order, providing the following advantages to the system: (1) it allows unconstrained movement of the head, and (2) it can be easily integrated into a wearable device (e.g. glasses, neck-hung devices), in which case it can robustly recognize gestures in the presence of ego-motion. Experimental results show that this approach outperforms common methods that use restricted HMMs for each gesture.
By
Camiña, J. Benito; Rodríguez, Jorge; Monroy, Raúl
2 Citations
Nowadays, computers store critical information, prompting the development of mechanisms aimed at the timely detection of any kind of intrusion. Some of these mechanisms, called masquerade detectors, are often designed to signal an alarm whenever they detect an anomaly in system behavior. Usually, the profile of ordinary system behavior is built out of a history of command execution. However, in [1,2], we suggested that it is not a command, but the object upon which it is carried out, that may distinguish a masquerade from user participation; we also hypothesized that this approach provides a means for building masquerade detectors that work at a higher level of abstraction. In this paper, we report on a successful step towards the validation of this hypothesis. The crux of our abstraction stems from the observation that a directory often holds closely related objects, resembling a user task; thus, we do not have to account for accesses to individual objects; instead, we simply take each access to be an access to some ancestor directory of the object: the user task. Indeed, we shall prove that by looking into the access to only a few such user tasks, we can build a masquerade detector just as powerful as if we looked into the access to every single file system object. The advantages of this abstraction are paramount: it eases the construction and maintenance of a masquerade detection mechanism, as it yields much shorter models. Using the WUIL dataset [2], we have conducted two experiments to compare the performance of two one-class classifiers, namely Naïve Bayes and Markov chains, considering single objects and our abstraction to user tasks. We shall see that in both cases, the task-based masquerade detector outperforms the individual object-based one.
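The abstraction from file-system objects to user tasks can be sketched as mapping each accessed path to its closest enclosing task directory. The task list below is hypothetical, and `PurePosixPath.is_relative_to` requires Python 3.9+:

```python
from pathlib import PurePosixPath

# hypothetical task directories; in the paper these come from the user's profile
TASKS = ["/home/alice/thesis", "/home/alice/photos", "/etc"]

def to_task(path, tasks=TASKS):
    """Abstract an object access to its closest enclosing task directory;
    accesses outside every task fall back to their own parent directory."""
    p = PurePosixPath(path)
    # check the most specific (longest) task prefixes first
    for t in sorted(tasks, key=len, reverse=True):
        if p.is_relative_to(t):
            return t
    return str(p.parent)
```

Feeding the classifier the sequence of task identifiers instead of individual object paths is what yields the much shorter models the abstract mentions.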
By
Arana-Daniel, Nancy; Villaseñor, Carlos; López-Franco, Carlos; Alanís, Alma Y.
Show all (4)
1 Citation
In the computer vision field, Structure from Motion (SfM) algorithms offer good advantages for numerous applications (augmented reality, autonomous navigation, motion capture, remote sensing, object recognition, image-based 3D modeling, among others); nevertheless, these algorithms show some weaknesses. In the present paper we propose the use of the Bio-inspired Aging Model PSO (BAM-PSO) to improve the accuracy of SfM algorithms. The BAM-PSO algorithm is used over a Geometric Algebra (GA) framework in order to compute the rigid motion in images, and this allows us to obtain a numerically stable algorithm.
By
Sandoval-Almazan, Rodrigo; Gil-Garcia, J. Ramon
8 Citations
Open government implementation involves several actions: public policy design, software implementation, website development, policy informatics, and the development of new regulations. Despite this important progress, very little has been done to measure the impact of open government and provide feedback on the next steps for implementation. Furthermore, very few models intend to explain the functions, characteristics, or the future of this new trend toward openness. Our research from 2006 to 2012 uses a multi-component model to measure open government websites in the 32 Mexican state governments. However, the website model could become obsolete as a result of technology advancements. This paper analyzes some knowledge gaps and potential problems with this type of model and proposes a new approach to open government portals based on four conceptual pillars: wikinomics, open data, new institutionalism, and the fifth estate (Network State).
By
Unda-Trillas, Emilio; Rivera-Rovelo, Jorge
1 Citation
An adaptable structure for building a classification tree is presented. From this structure, different existing classification trees can be obtained, but we can also build new ones and compare the results of different trees (classification error, tree size, number of levels, or other defined criteria). We use the adaptable scheme to emulate ID3, C4.5 and M5 trees, but also to create a new tree (called the general tree). The results obtained show that we can reproduce the results of the original trees and that, in the case of the general tree, its results are very close to those of the best classifier among the three trees studied.
By
LópezOrnelas, Erick; AbascalMena, Rocío; ZepedaHernández, J. Sergio
1 Citations
Geocollaboration appears when individuals or groups work together to solve spatial decisionmaking problems facilitated by geospatial information technologies. In this paper we focus on current developments in geocollaboration to help urban mobility. This work shows a collaborative mobile prototype that help people to take some decisions and share knowledge from a city. The main prototype recommends a better route to users in order to promote “walkability”, in our case, Mexico City. The system not only takes on account the user profile, but the time, the date, the recommendation of other users and their spatial activity in order to give the best route.
By
Martinez-Enriquez, A. M.; Escalada-Imaz, G.; Muhammad, Aslam
This paper describes how a non-expert computer user can solve numerical problems using Knowledge-Based System (KBS) principles. The aim of our approach is to handle the numerical information of the problem using backward logical inference, deducing whether or not the required mathematical composition functions exist. We implemented a backward inference algorithm bounded by O(n), where n is the size of the KBS. Moreover, since the inference engine proceeds in a top-down way, it scans a much smaller search space than forward-chaining algorithms. Our experimental example deals with statistical analysis and its application to clustering and concept formation.
By
Chávez, Edgar; Ludueña, Verónica; Reyes, Nora; Roggero, Patricia
4 Citations
In this paper we present the Distal Spatial Approximation Tree (DiSAT), an algorithmic improvement of the SAT. Our improvement increases the discarding power of the SAT by selecting distal nodes instead of the proximal nodes proposed in the original paper. Our approach is parameter-free, and it was the most competitive in an extensive benchmark: from two to forty times faster than the SAT, and faster than the List of Clusters (LC), which is considered the state of the art among main-memory, linear-sized indexes in the model of distance computations.
In summary, we obtained an index more resistant to the curse of dimensionality, establishing a new benchmark in performance, faster to build than the LC, and with a small memory footprint. Our strategies can be used in any version of the SAT, either in main or secondary memory.
By
Martínez-Angeles, Carlos Alberto; Dutra, Inês; Costa, Vítor Santos; Buenabad-Chávez, Jorge
3 Citations
We present the design and evaluation of a Datalog engine for execution in Graphics Processing Units (GPUs). The engine evaluates recursive and non-recursive Datalog queries using a bottom-up approach based on typical relational operators. It includes a memory management scheme that automatically swaps data between memory in the host platform (a multicore) and memory in the GPU in order to reduce the number of memory transfers. To evaluate the performance of the engine, four Datalog queries were run on the engine and on a single CPU in the multicore host. One query runs up to 200 times faster on the (GPU) engine than on the CPU.
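The bottom-up evaluation strategy this abstract refers to can be illustrated with semi-naive evaluation of the classic transitive-closure query — a plain-Python sketch of the general technique, not the authors' GPU engine:

```python
def transitive_closure(edges):
    """Semi-naive bottom-up evaluation of the Datalog program
    path(X,Y) :- edge(X,Y).
    path(X,Z) :- path(X,Y), edge(Y,Z)."""
    path = set(edges)      # rule 1: copy the base relation
    delta = set(edges)     # facts derived in the previous round
    while delta:
        # rule 2: join only the *new* facts against edge
        new = {(x, z) for (x, y) in delta for (y2, z) in edges if y == y2}
        delta = new - path  # keep only facts not seen before
        path |= delta
    return path

print(sorted(transitive_closure({(1, 2), (2, 3), (3, 4)})))
# → [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)]
```

The "semi-naive" refinement (joining only the delta of the previous round) is what makes repeated bottom-up passes over the relational operators affordable.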
By
Garro, Beatriz Aurora; Vazquez, Roberto Antonio; Rodríguez, Katya
4 Citations
DNA microarrays are a powerful technique in genetic science due to the possibility of analyzing the gene expression level of millions of genes at the same time. Using this technique, it is possible to diagnose diseases, identify tumours, select the best treatment to resist illness, detect mutations, and support prognosis. However, the main problem that arises when DNA microarrays are analyzed with computational intelligence techniques is that the number of genes is too big and the samples are too few. For this reason, it is necessary to apply preprocessing techniques to reduce the dimensionality of DNA microarrays. In this paper, we propose a methodology to select the best set of genes that allows classifying the disease class of a gene expression profile with good accuracy, using the Artificial Bee Colony (ABC) algorithm and distance classifiers. The results are compared against the Principal Component Analysis (PCA) technique and others from the literature.
By
Cornejo, Raymundo; Hernandez, Daniel; Tentori, Monica; Favela, Jesus
1 Citation
Natural interfaces are facilitating the adoption of videogames by older adults, promoting the development of serious games aimed at encouraging healthy behaviors in this population. In this paper we present the design and evaluation of an ambient game, GuessMyCaption, aimed at enhancing the social networks of older adults, which are known to have an impact on their wellbeing. GuessMyCaption was deployed during a five-week study in the home of one older adult and twelve relatives. The results demonstrate that GuessMyCaption is easy to use and keeps an older adult engaged with exercises while offering new opportunities for online and offline socialization. GuessMyCaption had a positive impact on the perceived wellbeing of the older adult, improving her perception of her cognitive skills and physical health, and catalyzing socialization. This research shows that the use of natural interfaces and family memorabilia facilitates the adoption of serious games, improves older adults' perceived wellbeing, and encourages socialization.
By
Lazo-Cortés, Manuel S.; Martínez-Trinidad, José Fco.; Carrasco-Ochoa, Jesús Ariel; Sanchez-Diaz, Guillermo
1 Citation
This paper deals with the relation between rough set reducts and typical testors from the logical combinatorial approach to pattern recognition. The main objective is to clarify once and for all that, although in many cases the two concepts coincide, strictly speaking they are not the same. Definitions, comments, and observations are formally introduced and supported by illustrative examples. Furthermore, some theorems expressing theoretical relations between reducts and typical testors are stated and proved.
By
Trujillo, Leonardo; Muñoz, Luis; Naredo, Enrique; Martínez, Yuliana
3 Citations
The Operator Equalization (OE) family of bloat control methods has achieved promising results in many domains. In particular, the Flat-OE method, which promotes a flat distribution of program sizes, is one of the simplest OE methods and achieves some of the best results. However, Flat-OE, like all OE variants, can be computationally expensive. This work proposes a simplified strategy for bloat control based on Flat-OE. In particular, bloat is studied in the NeuroEvolution of Augmenting Topologies (NEAT) algorithm. NEAT includes a very simple diversity preservation technique based on speciation and fitness sharing, and it is hypothesized that with some minor tuning, speciation in NEAT can promote a flat distribution of program sizes. Results indicate that this is the case in two benchmark problems, in accordance with results for Flat-OE. In conclusion, NEAT provides a worthwhile strategy for effective and simple bloat control that could be extrapolated to other GP systems.
By
Villarreal, B. Lorena; Olague, Gustavo; Gordillo, J. L.
2 Citations
Smell sensors in mobile robotics for odor source localization are attracting the attention of researchers around the world. Solving this problem requires considering the environmental model and odor behavior, the perception system, and the algorithm for tracking the odor plume. Current algorithms try to emulate the behavior of animals known for their ability to follow odors. Nevertheless, odor perception systems are still in their infancy and far from comparable with the biological sense of smell. For this reason, this work presents an algorithm that takes into account the capabilities and drawbacks of the perception system, the environmental model, and the odor behavior. In addition, an artificial intelligence technique (Genetic Programming) is used as a platform to develop odor source localization algorithms, prepared for different environmental conditions and perception systems. A comparison between this improved algorithm and two basic odor source localization techniques is presented in terms of repeatability.
By
Poria, Soujanya; Agarwal, Basant; Gelbukh, Alexander; Hussain, Amir; Howard, Newton
15 Citations
Concept-level text analysis is superior to word-level analysis as it preserves the semantics associated with multi-word expressions. It offers a better understanding of text and helps to significantly increase the accuracy of many text mining tasks. Concept extraction from text is a key step in concept-level text analysis. In this paper, we propose a ConceptNet-based semantic parser that deconstructs natural language text into concepts based on the dependency relation between clauses. Our approach is domain-independent and is able to extract concepts from heterogeneous text. Through this parsing technique, 92.21% accuracy was obtained on a dataset of 3,204 concepts. We also show experimental results on three different text analysis tasks, on which the proposed framework outperformed state-of-the-art parsing techniques.
By
González, Domingo Iván Rodríguez; Hayet, JeanBernard
In this article, we propose a new, fast approach to detect human beings from RGB-D data, named Progressive Classification. The idea of this method is quite simple: as in several state-of-the-art algorithms, the classification is based on the evaluation of HOG-like descriptors within image test windows, which are divided into a set of blocks. In our method, the blocks are evaluated progressively in a particular order, such that the blocks that contribute most to the separability between the human and non-human classes are evaluated first. This permits an early decision about human detection without necessarily evaluating all the blocks, thereby accelerating the detection process. We evaluate our method with different HOG-like descriptors and on a challenging dataset.
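The early-decision idea can be sketched as follows — a hypothetical linear block classifier with an optimistic bound on the remaining blocks. All names, the toy blocks, and the features-in-[0, 1] assumption are ours for illustration, not the paper's actual descriptors:

```python
def progressive_classify(window, blocks, weights, bias, threshold):
    """Early-exit evaluation of a linear classifier over HOG-like blocks.

    `blocks` holds per-block feature extractors, pre-sorted so the most
    discriminative blocks come first; `weights` holds one weight vector
    per block.  Features are assumed to lie in [0, 1], so |w| bounds the
    contribution any remaining weight can still make to the score.
    """
    # max_gain[i] = optimistic bound on what blocks i..end could still add
    max_gain = [0.0] * (len(blocks) + 1)
    for i in range(len(blocks) - 1, -1, -1):
        max_gain[i] = max_gain[i + 1] + sum(abs(w) for w in weights[i])

    score = bias
    for i, extract in enumerate(blocks):
        score += sum(w * f for w, f in zip(weights[i], extract(window)))
        if score + max_gain[i + 1] < threshold:
            return False  # early rejection: the threshold is unreachable
    return score >= threshold

# toy usage: two one-feature "blocks" that just read pixel values
blocks = [lambda w: [w[0]], lambda w: [w[1]]]
weights = [[1.0], [1.0]]
print(progressive_classify([0.9, 0.9], blocks, weights, 0.0, 1.5))  # → True
print(progressive_classify([0.1, 0.9], blocks, weights, 0.0, 1.5))  # → False
```

Because most scanned windows contain no person, the early rejection fires often, which is where the speed-up of this kind of scheme comes from.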
By
Romero-H., Reimer A.; Renero-C., Francisco J.
A methodology is proposed to determine the positions of characteristic facial points. The method is based on the Point Distribution Model, which estimates the most likely positions of the points. The search is refined with a discriminator acting locally around each point of the model; this discriminant model was obtained by training a Support Vector Machine on images vectorized as histograms of oriented gradients (HoG). Then the points are approximated by a Thin Plate Spline, using lambda parameters whose values were adjusted according to the local search errors. The models achieved performances that reached 90% in cross-validation. Likewise, the strategy shows an accumulated error of less than 10% at some characteristic points. The algorithm was evaluated with the MUCT and BioID databases. This strategy would enable digital morphological and anthropometric assessments of the human face.
By
Yee, Arturo; Rodríguez, Reinaldo; Alvarado, Matías
1 Citation
In this paper, American football strategies are analyzed by applying Nash equilibrium. Depending on the offensive or defensive team role, each player usually practices the plays relevant to his role; each play is rated according to the benefit it could add to the team's success. The team strategies, which combine the individual players' plays, are identified by means of the strategy profiles of a normal-form game model of American football and are valued by each player's payoff function. Hence, the Nash equilibrium strategy profiles can be identified and used for action decision making during a match.
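For a two-player normal-form game, the Nash equilibrium profiles the abstract mentions can be found by exhaustively checking unilateral deviations — a generic sketch with an invented run/pass-vs-blitz/cover payoff table, not the paper's actual model:

```python
from itertools import product

def pure_nash_equilibria(payoffs):
    """Find pure-strategy Nash equilibria of a 2-player normal-form game.

    payoffs[i][j] = (row player's payoff, column player's payoff) when
    the row player chooses i and the column player chooses j.  A profile
    (i, j) is an equilibrium when neither player gains by deviating alone.
    """
    rows, cols = len(payoffs), len(payoffs[0])
    equilibria = []
    for i, j in product(range(rows), range(cols)):
        row_best = all(payoffs[i][j][0] >= payoffs[k][j][0] for k in range(rows))
        col_best = all(payoffs[i][j][1] >= payoffs[i][k][1] for k in range(cols))
        if row_best and col_best:
            equilibria.append((i, j))
    return equilibria

# Toy offense-vs-defense example (payoffs are purely illustrative):
# offense chooses run/pass, defense chooses blitz/cover.
game = [[(3, -3), (1, -1)],   # run  vs blitz, cover
        [(2, -2), (2, -2)]]   # pass vs blitz, cover
print(pure_nash_equilibria(game))
# → [(1, 1)]
```

Here the only equilibrium is (pass, cover): at that profile neither the offense nor the defense can improve its payoff by switching play alone.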
By
López-González, G.; Arana-Daniel, Nancy; Bayro-Corrochano, Eduardo
1 Citation
This paper presents Quaternion Support Vector Machines (QSVM) for classification as a generalization of real- and complex-valued Support Vector Machines. In this framework we handle the design of kernels involving the Clifford or quaternion product. The QSVM allows changing the metric involved in the quaternion product. The application section shows experiments in pattern recognition and colour image processing.
By
Chang, Leonardo; Arias-Estrada, Miguel; Hernández-Palancar, José; Sucar, L. Enrique
Shape information has proven to be useful in many computer vision applications. In this work, a self-contained shape descriptor for open and closed contours is proposed, together with a partial shape matching method robust to partial occlusion and noise in the contour. Both the shape descriptor and the matching method are invariant to rotation and translation. Experiments were carried out on the Shapes99 and Shapes216 datasets, where contour segments of different lengths were removed to obtain partial occlusion as high as 70%. For the highest occlusion levels, the proposed method outperformed other popular shape description methods, with up to a 50% higher bull's-eye score.
By
Ruiz, Elías; Sucar, L. Enrique
A novel proposal for a compositional model for object recognition is presented. The proposed method is based on visual grammars and Bayesian networks. An object is modeled as a hierarchy of features and spatial relationships. The grammar is learned automatically from examples. This representation is automatically transformed into a Bayesian network. Thus, recognition is based on probabilistic inference in the Bayesian network representation. Preliminary results in recognition of natural objects are presented. The main contribution of this work is a general methodology for building object recognition systems which combines the expressivity of a grammar with the robustness of probabilistic inference.
By
Camacho-Bello, C.; Báez-Rojas, J. J.
1 Citation
We present a novel method for gait phase detection based on Krawtchouk moments, which can be used in gait analysis. The low computational cost and high descriptive capacity of the Krawtchouk moments make it easy to detect the parameters of the gait cycle, such as the swing phase, stance phase, and double support. In addition, we present the gait phase detection results obtained with the proposed method on 10 test subjects and compare them with standard values.
By
Tovar, Mireya; Pinto, David; Montes, Azucena; González, Gabriel; Vilariño, Darnes; Beltrán, Beatriz
2 Citations
In this paper we present an approach for the evaluation of taxonomic relations of restricted-domain ontologies. We use the evidence found in corpora associated with the ontology domain to determine the validity of the taxonomic relations. Our approach employs lexico-syntactic patterns for evaluating taxonomic relations in which the concepts are totally different, and it uses a particular technique based on subsumption for those relations in which one concept is completely included in the other. The integration of these two techniques has allowed us to automatically evaluate taxonomic relations for two restricted-domain ontologies. The performance obtained was about 70% for an ontology of the e-learning domain, whereas we obtained around 88% for the ontology associated with the artificial intelligence domain.
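The lexico-syntactic (Hearst-style) patterns the abstract relies on can be sketched minimally as follows; the single "such as" pattern and the example sentence are ours for illustration, not the authors' actual pattern set:

```python
import re

# "X such as A, B and C" suggests the taxonomic relations
# (A is-a X), (B is-a X), (C is-a X).
SUCH_AS = re.compile(r"(\w+) such as ((?:\w+(?:, | and ))*\w+)")

def extract_isa(text):
    """Return (concept, broader-concept) pairs suggested by the pattern."""
    pairs = set()
    for m in SUCH_AS.finditer(text):
        hypernym = m.group(1).lower()
        for member in re.split(r", | and ", m.group(2)):
            pairs.add((member.lower(), hypernym))
    return pairs

print(sorted(extract_isa("We compared algorithms such as ID3, CART and M5.")))
# → [('cart', 'algorithms'), ('id3', 'algorithms'), ('m5', 'algorithms')]
```

A candidate taxonomic relation from the ontology can then be checked against how often such pattern matches support it in the domain corpus.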
By
Rodriguez-Maya, Noel E.; Graff, Mario; Flores, Juan J.
1 Citation
Modelling the behaviour of algorithms is the realm of Evolutionary Algorithm theory. From a practitioner’s point of view, theory must provide some guidelines regarding which algorithm/parameters to use in order to solve a particular problem. Unfortunately, most theoretical models of evolutionary algorithms are difficult to apply to realistic situations. Recently, there have been works that addressed this problem by proposing models of performance of different Genetic Programming Systems. In this work, we complement previous approaches by proposing a scheme capable of classifying the hardness of optimization problems based on different difficulty measures such as Negative Slope Coefficient, Fitness Distance Correlation, Neutrality, Ruggedness, Basins of Attraction, and Epistasis. The results indicate that this procedure is able to accurately classify the performance of the GA over a set of benchmark problems.
By
Flores-Garrido, Marisol; Carrasco-Ochoa, Jesús Ariel; Martínez-Trinidad, José Fco.
2 Citations
Graph pattern mining is an important task in Data Mining, and several algorithms have been proposed to solve this problem. Most of them require that a pattern and its occurrences be identical; thus, they rely on solving the graph isomorphism problem. In recent years, however, some algorithms have focused on the case in which label and edge structure differences between a pattern and its occurrences are allowed while maintaining a bijection among vertices, using inexact matching during the mining process. Recently, an algorithm that allows structural differences in vertices was proposed. This feature allows it to find patterns missed by other algorithms, but do these extra patterns actually contain useful information? We explore the answer to this question by performing an experiment in the context of unsupervised mining tasks. Our results suggest that by allowing structural differences in both vertices and edges, it is possible to obtain new useful information.
By
Almeida, Edwing; Ferruzca, Marco; Pilar Morales Tlapanco, María
1 Citation
One of the major diseases afflicting the elderly population in Mexico is depression. This document describes the process of designing a system for the early detection and treatment of depression in older adults, taking advantage of technological developments in the Internet of Things, context awareness, and the concept of eHealth. The system determines the Activities of Daily Living (ADL) using gesture-recognition log events, and detects abnormalities as a means of identifying variations in the ADL.
By
Chacon-Murguia, Mario I.; Villalobos-Montiel, Adrian J.; Calderon-Contreras, Jorge D.
2 Citations
This work presents a methodology to automatically detect the symmetry point of the breast. To achieve this goal, the algorithm corrects thermal image tilt to find a breast symmetry axis, computes a modified symmetry index that can be used as a measure of image quality, breast cosmetic, and pathologic issues, and provides a seed location for a previously reported algorithm that automatically performs breast cancer analysis. The methodology involves filtering, edge detection, windowed edge analysis, and shape detection based on the Hough transform. Experimental results show that the proposed method is able to define the symmetry axis as precisely as a person does, and correctly detected breast areas in 100% of the cases considered, allowing automatic breast analysis by a previous algorithm.
By
García-Borroto, Milton; Martínez-Trinidad, José Fco.; Carrasco-Ochoa, Jesús Ariel
18 Citations
Obtaining an accurate class prediction for a query object is an important component of supervised classification. However, it can also be important to understand the classification in terms of the application domain, especially if the prediction disagrees with the expected results. Many accurate classifiers are unable to explain their classification results in terms understandable by an application expert. Classifiers based on emerging patterns, on the other hand, are accurate and easy to understand. The goal of this article is to review the state-of-the-art methods for mining emerging patterns, classify them by different taxonomies, and identify new trends. In this survey, we present the most important emerging pattern miners, categorizing them on the basis of the mining paradigm, the use of discretization, and the stage where the mining occurs. We provide detailed descriptions of the mining paradigms with their pros and cons, which helps researchers and users select the appropriate algorithm for a given application.
By
Pinilla-Buitrago, Laura Alejandra; Martínez-Trinidad, José Fco.; Carrasco-Ochoa, J. A.
A skeleton is a simplified and efficient descriptor of shape, which is of great importance in computer graphics and vision. In this paper, we present a new method for computing skeletons from 2D binary shapes. The contour of each shape is represented by a set of dominant points, which are obtained by a non-parametric method. Then, a set of convex dominant points is used to build the skeleton. Finally, we iteratively remove some skeleton branches in order to get a clean skeleton representation. The proposed method is compared against other state-of-the-art methods. The results show that the skeletons built by our method are more stable across a wider range of shapes than the skeletons obtained by other methods, and the shapes reconstructed from our skeletons are closer to the original shapes.
By
Cruz-Chávez, Marco Antonio; Martínez-Oropeza, Alina; Carmen Peralta-Abarca, Jesús; Cruz-Rosales, Martín Heriberto; Martínez-Rangel, Martín
A comparative analysis of several neighborhood structures is presented, including a variable neighborhood structure that corresponds to a combination of the neighborhood structures evaluated in this paper. The performance of each neighborhood structure was tested using large random instances generated in this research and well-known benchmarks such as the classical Symmetric Traveling Salesman Problem and the Unrelated Parallel Machines Problem. Experimental results show differences in the performance of the variable neighborhood search when it is applied to problems of differing complexity. Contrary to reports in the literature on variable neighborhood search, its performance varies according to the complexity of the problem.
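The variable neighborhood search idea being compared above follows a simple schema — this is a textbook sketch (a reduced VNS on a toy integer minimization problem), not the paper's implementation or benchmark problems:

```python
import random

def vns(x, cost, neighborhoods, iters=200, seed=0):
    """Reduced Variable Neighborhood Search: shake in neighborhood k;
    on improvement restart at the smallest neighborhood (k = 0),
    otherwise widen the search by moving to neighborhood k + 1."""
    rng = random.Random(seed)
    best, best_c = x, cost(x)
    for _ in range(iters):
        k = 0
        while k < len(neighborhoods):
            cand = neighborhoods[k](best, rng)  # shake in neighborhood k
            if cost(cand) < best_c:             # improvement: back to k = 0
                best, best_c = cand, cost(cand)
                k = 0
            else:                               # no luck: widen the search
                k += 1
    return best, best_c

# toy: minimize (x - 3)^2 over integers, with step-1 and step-5 moves
steps = [lambda x, r: x + r.choice([-1, 1]),
         lambda x, r: x + r.choice([-5, 5])]
print(vns(40, lambda x: (x - 3) ** 2, steps))
```

The list of neighborhoods of growing reach is exactly the kind of structure whose composition the paper varies and compares across problems.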
By
Pogrebnyak, Oleksiy; Hernández-Bautista, Ignacio; Camacho Nieto, Oscar; Manrique Ramírez, Pablo
A method for lossless image compression using a lifting-scheme wavelet transform is presented. The proposed method adjusts the wavelet filter coefficients by analyzing signal spectral characteristics to obtain a higher compression ratio in comparison to the standard CDF(2,2) and CDF(4,4) filters. The proposal is based on spectral pattern recognition with a 1-NN classifier. Spectral patterns of a small fixed length are formed for the entire image, thus permitting global optimization of the filter coefficients, equal for all decompositions. The proposed method was applied to a set of test images, obtaining better entropy values in comparison to the standard wavelet lifting filters.
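For reference, the standard CDF(2,2) integer lifting step that serves as the method's baseline can be sketched as follows — a generic 5/3-filter sketch with simplified (clamped) border handling, not the authors' adaptive filters:

```python
def cdf22_forward(x):
    """One level of the integer CDF(2,2) lifting transform (the 5/3
    filter used for lossless coding).  Assumes an even-length signal;
    borders are handled by clamping indices."""
    even, odd = x[::2], x[1::2]
    n = len(odd)
    # predict step: detail = odd sample minus average of neighboring evens
    d = [odd[i] - ((even[i] + even[min(i + 1, len(even) - 1)]) >> 1)
         for i in range(n)]
    # update step: smooth = even sample plus a quarter of neighboring details
    s = [even[i] + ((d[max(i - 1, 0)] + d[min(i, n - 1)] + 2) >> 2)
         for i in range(len(even))]
    return s, d

s, d = cdf22_forward([10, 12, 14, 16, 18, 20, 22, 24])
print(s, d)
# → [10, 14, 18, 23] [0, 0, 0, 2]
```

On this smooth ramp the detail coefficients are mostly zero, which is precisely what makes the subsequent entropy coding effective; the paper's contribution is tuning the filter coefficients per image instead of fixing them.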
By
Farzana, J.; Muhammad, Aslam; Martinez-Enriquez, A. M.; Afraz, Z. S.; Talha, W.
1 Citation
Vision loss is one of the ultimate obstacles in the lives of the blind, preventing them from performing tasks on their own and self-reliantly. The blind must rely on others for the selection of trendy and eye-catching accessories, because unassisted buying can lead to choices that do not match their personality or the style of their society. That is why they are bound to depend on their families for shopping assistance, who often may not afford quality time due to busy routines. This dependency undermines the blind person's self-confidence, their ability to negotiate, their decision-making power, and their social activities. Via uninterrupted speech communication, our proposed talking accessories-selector assistant for the blind provides quick decision support in picking routinely wearable accessories such as dresses, shoes, and cosmetics, according to social trends and events. The foremost aim of this assistant is to make the blind independent and more assertive.
By
Rosenblueth, David A.; Muñoz, Stalin; Carrillo, Miguel; Azpeitia, Eugenio
6 Citations
Boolean networks are important models of gene regulatory networks. Such models are sometimes built from (1) a gene interaction graph and (2) a set of biological constraints. A gene interaction graph is a directed graph representing positive and negative gene regulations. Depending on the biological problem being solved, the set of biological constraints can vary and may include, for example, a desired set of stationary states. We present a symbolic, SAT-based method for inferring synchronous Boolean networks from interaction graphs augmented with constraints. Our method first constructs Boolean formulas in such a way that each truth assignment satisfying these formulas corresponds to a Boolean network modeling the given information. Next, we employ a SAT solver to obtain the desired Boolean networks. Through a prototype, we show results illustrating the use of our method in the analysis of Boolean gene regulatory networks of the Arabidopsis thaliana root stem cell niche.
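The inference task can be illustrated at toy scale by brute-force enumeration instead of SAT — the two-gene interaction graph and the required stationary state below are invented for illustration, and regulation signs are not enforced in this sketch:

```python
from itertools import product

# Toy setting: gene -> the genes that regulate it, plus a state that
# must be a fixed point of the synchronous dynamics.
regulators = {0: (1,), 1: (0, 1)}
target_fixed_point = (1, 1)

def step(state, tables):
    """One synchronous update: every gene reads its regulators."""
    return tuple(tables[g][tuple(state[r] for r in regulators[g])]
                 for g in sorted(regulators))

def all_tables(inputs):
    """Every Boolean truth table over the given input genes."""
    keys = list(product((0, 1), repeat=len(inputs)))
    for outs in product((0, 1), repeat=len(keys)):
        yield dict(zip(keys, outs))

# Keep the networks whose dynamics leave the target state fixed.
consistent = [
    (t0, t1)
    for t0 in all_tables(regulators[0])
    for t1 in all_tables(regulators[1])
    if step(target_fixed_point, (t0, t1)) == target_fixed_point
]
print(len(consistent))
# → 16
```

The paper's SAT encoding performs this same filtering symbolically, which is what makes the search feasible for networks far beyond toy size.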
By
Hernández-León, Raudel; Hernández-Palancar, José; Carrasco-Ochoa, J. A.; Martínez-Trinidad, José Fco.
In Associative Classification, building a classifier based on Class Association Rules (CARs) consists in finding an ordered CAR list by applying a rule ordering strategy. Since this CAR list will be used to build a classifier, it is important to develop a good rule ordering strategy. In this paper, we introduce four novel hybrid rule ordering strategies; the first three combine the Netconf measure with Support-Confidence-based rule ordering strategies. The fourth strategy, called Hybrid Specific Rules/Netconf (SR/NF), combines the Netconf measure with a rule ordering strategy based on the CAR's size. The experiments show that the proposed strategies obtain better classification accuracy than the best ordering strategies reported in the literature.
By
Caballero-Morales, Santiago-Omar
In this paper, an approach based on the insertion of "markers" is proposed to increase the performance of face recognition based on principal component analysis (PCA). The markers represent zero-valued pixels which are expected to remove information likely to affect classification (noisy pixels). The pattern of the markers was optimized with a genetic algorithm (GA), in contrast to other noise generation techniques. Experiments performed on a well-known face database showed that the technique was able to achieve significant improvements over PCA, particularly when the training data was small in comparison with the size of the testing sets. This was also observed when the number of eigenfaces used for classification was small.
By
Escalante, Hugo Jair; Sotomayor, Mauricio; Montes, Manuel; Lopez-Monroy, A. Pastor
Naïve Bayes nearest neighbors (NBNN) is a variant of the classic KNN classifier that has proved to be very effective for object recognition and image classification tasks. Under NBNN, an unseen image is classified by looking at the distance between the sets of visual descriptors of the test and training images. Although NBNN is a very competitive pattern classification approach, it presents a major drawback: it requires large storage and computational resources. NBNN's requirements are even larger than those of the standard KNN because sets of raw descriptors must be stored and compared; therefore, efficiency improvements for NBNN are necessary. Prototype generation (PG) methods have proved to be helpful for reducing the storage and computational requirements of standard KNN. PG methods learn a reduced subset of prototypical instances to be used by KNN for classification. In this contribution we study the suitability of PG methods for enhancing the capabilities of NBNN. Through an extensive comparative study, we show that PG methods can dramatically reduce the number of descriptors required by NBNN without significantly affecting its discriminative performance. In fact, we show that PG methods can improve the classification performance of NBNN while using far fewer visual descriptors. We compare the performance of NBNN to other state-of-the-art object recognition approaches and show that the combination of PG and NBNN outperforms alternative techniques.
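The NBNN decision rule itself is compact enough to sketch — each test descriptor votes with its distance to the nearest stored descriptor of each class, and the class with the smallest total wins. The toy 2-D descriptors and class names below stand in for real SIFT-like vectors:

```python
def nbnn_classify(test_descriptors, class_descriptors):
    """NBNN decision rule: assign the image to the class minimizing the
    summed distance from each test descriptor to that class's nearest
    stored descriptor (image-to-class, not image-to-image, distance)."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    def image_to_class(cls_descs):
        return sum(min(sq_dist(d, c) for c in cls_descs)
                   for d in test_descriptors)

    return min(class_descriptors,
               key=lambda cls: image_to_class(class_descriptors[cls]))

classes = {"cat": [(0.0, 0.1), (0.2, 0.0)],
           "dog": [(1.0, 1.0), (0.9, 1.2)]}
print(nbnn_classify([(0.1, 0.1), (0.2, 0.1)], classes))
# → cat
```

The inner `min` over every stored descriptor of every class is the cost that prototype generation attacks: replacing the raw descriptor sets with a much smaller learned subset shrinks that search without changing the decision rule.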
By
Sandoval, Guillermo; Vazquez, Roberto A.; Garcia, Paulina; Ambrosio, Jose
3 Citations
Agricultural activities can represent an important sector of the economy of certain countries. In order to maintain control of this sector, it is necessary to schedule censuses on a regular basis, which represents an enormous cost. In recent years, different techniques have been proposed with the objective of reducing the cost and improving automation, ranging from the use of Personal Digital Assistants to satellite image processing. In this paper, we describe a methodology to perform a crop classification task over satellite images based on the Gray Level Co-Occurrence Matrix (GLCM) and a Radial Basis Function (RBF) neural network. Furthermore, we study how different color spaces can be applied to analyze satellite images. To test the accuracy of the proposal, we apply the methodology to a region and present a comparison evaluating the efficiency using three color spaces and different distance classifiers.
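The GLCM texture features that feed the classifier can be sketched for a single offset (one pixel to the right) together with the classic "contrast" feature; the tiny 4-level image is a toy stand-in for a satellite image patch:

```python
def glcm(image, levels):
    """Co-occurrence counts of gray-level pairs (i, j) at offset (0, +1)."""
    m = [[0] * levels for _ in range(levels)]
    for row in image:
        for a, b in zip(row, row[1:]):  # horizontal neighbor pairs
            m[a][b] += 1
    return m

def contrast(m):
    """Haralick contrast: large when neighboring gray levels differ."""
    total = sum(sum(row) for row in m)
    return sum((i - j) ** 2 * m[i][j]
               for i in range(len(m)) for j in range(len(m))) / total

img = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [0, 2, 2, 2],
       [2, 2, 3, 3]]
m = glcm(img, 4)
print(m[0][0], round(contrast(m), 3))
# → 2 0.583
```

A full pipeline of this kind would compute several such features (contrast, energy, homogeneity, …) over multiple offsets and color-space channels and pass the resulting vector to the RBF network.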
