Showing 1 to 70 of 70 matching Articles
By
Korjik, Valeri; Morales-Luna, Guillermo
3 Citations
We consider a scenario where information hiding (IH) is performed through noisy channels. Different situations may arise, but one of the most common is the case where the legal IH channel is superior to the attacker's IH channel. If a special randomized encoding is used by the legal users, then it is possible to hide information in the noisy components of the cover message. At the same time, the randomized encoding prevents the secret message from being removed from the stegomessage without significant distortion of the cover message. If a legal decoder of IH knows the cover message, the randomized encoding procedure does not prevent error correction of the secret message at the receiving side. A special problem of IH is presented: how to distinguish a binary periodic repetitive sequence from truly random binary noise received over noisy channels. Application of the randomized encoding technique makes a solution to this problem more difficult and hence complicates traffic analysis. We also consider how it is possible to “camouflage” IH with natural channel noise independently of the properties of the cover message's probability space, and the application of watermarking (WM) in combination with randomized encoding dedicated to the utilization of noisy channels. If an attacker tries to remove the WM by adding binary noise, then an expansion of errors in the cover message results.
By
Farré, Jacques; Gálvez, José Fortes
4 Citations
Parser generation tools currently used for computer-language analysis rely on user wisdom to resolve grammar conflicts. Here, practical LR(0)-based parser generation is introduced, with automatic conflict resolution by potentially unbounded lookahead exploration. The item dependence graph underlying the LR(0) automaton is used for lookahead-DFA construction. A bounded graph-connect technique overcomes the difficulties of previous approaches with empty rules, and compact coding allows right-hand contexts to be resumed precisely. The resulting parsers are deterministic and linear, and accept a large class of LR-regular grammars including LALR(k). Their construction is formally introduced, shown to be decidable, and illustrated by a detailed example.
By
Coello Coello, Carlos A.
59 Citations
This tutorial will review some of the basic concepts related to evolutionary multi-objective optimization (i.e., the use of evolutionary algorithms to handle more than one objective function at a time). The most commonly used evolutionary multi-objective optimization techniques will be described and criticized, including some of their applications. Theory, test functions, and metrics will also be discussed. Finally, we will provide some possible paths of future research in this area.
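The selection machinery common to the techniques surveyed in this tutorial rests on Pareto dominance. As a purely illustrative sketch (not taken from the tutorial itself), the dominance test and non-dominated filtering for minimization can be written as:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization):
    no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated(points):
    """Keep only the points no other point dominates (the Pareto front)."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]
```

Maximization, or mixed objective senses, only changes the comparison directions.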
By
Corrochano, Eduardo Bayro
In this chapter we give the fundamentals of Lie algebra and the algebra of incidence using the computational frameworks of the null cone and the n-dimensional affine plane. Using Lie algebra within this computational framework has the advantage that it is easily accessible to the reader, because there is a direct translation from the familiar matrix representations to representations using bivectors from geometric algebra. Generally speaking, Lie group theory is the appropriate tool for the study and analysis of the action of a group on a manifold. Since we are interested in purely geometric computations, the use of geometric algebra is appropriate for carrying out computations in a coordinate-free manner by using a bivector representation of the most important Lie algebras. We will represent Lie operators using bivectors for the computation of a variety of invariants. This chapter benefits from work done in collaboration with Sobczyk [11, 100].
By
Corrochano, Eduardo Bayro
This chapter will demonstrate that geometric algebra provides a simple mechanism for unifying current approaches to the computation and application of projective invariants using n uncalibrated cameras. First, we describe Pascal's theorem as a type of projective invariant, and then the theorem is applied to computing camera-intrinsic parameters. The fundamental projective invariant, the cross-ratio, is studied in one, two, and three dimensions, using a single view and then n views. Next, by using the observations of two and three cameras, we apply projective invariants to the tasks of computing the view-center of a moving camera and to simplified visually guided grasping. The chapter also presents a geometric approach for the computation of shape and motion using projective invariants within a purely geometric algebra framework [47, 49]. Different approaches to projective reconstruction have utilized projective depth [96, 102], projective invariants [23], and factorization methods [81, 105, 107] (factorization methods incorporate projective depth calculations). We compute projective depth using projective invariants, which depend on the use of the fundamental matrix or trifocal tensor. Using these projective depths, we are then able to initiate a projective reconstruction procedure to compute shape and motion. We also apply the algebra of incidence in the development of geometric inference rules to extend 3D reconstruction.
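The cross-ratio mentioned above is easy to verify numerically in the one-dimensional case. A small sketch (illustrative only: scalar coordinates on a projective line, not the chapter's geometric-algebra formulation):

```python
def cross_ratio(a, b, c, d):
    """Cross-ratio (AC/BC) / (AD/BD) of four distinct collinear points,
    given by scalar coordinates on the line."""
    return ((c - a) / (c - b)) / ((d - a) / (d - b))

def moebius(x, m):
    """1-D projective map x -> (p*x + q) / (r*x + s)."""
    (p, q), (r, s) = m
    return (p * x + q) / (r * x + s)
```

Applying any invertible Moebius map to all four points leaves the cross-ratio unchanged, which is exactly what makes it a projective invariant.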
By
Chávez, Edgar; Navarro, Gonzalo
14 Citations
Range searches in metric spaces can be very difficult if the space is “high dimensional”, i.e. when the histogram of distances has a large mean and/or a small variance. This so-called “curse of dimensionality”, well known in vector spaces, is also observed in metric spaces. There are at least two reasons behind the curse of dimensionality: a large search radius and/or a high intrinsic dimension of the metric space. We present a general probabilistic framework based on stretching the triangle inequality, whose direct effect is a reduction of the effective search radius. The technique gets more effective as the dimension grows, and the basic principle can be applied to any search algorithm. In this paper we apply it to a particular class of indexing algorithms. We present an analysis which helps understand the process, as well as empirical evidence showing dramatic improvements in the search time at the cost of a very small error probability.
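A minimal sketch of the idea, under simplifying assumptions (a plain list as the "index", with pivot distances recomputed rather than stored): exact pivot-based exclusion discards an object u when |d(q,p) − d(p,u)| > r, and dividing that threshold by a stretching factor beta > 1 discards more aggressively at the price of possibly missing some answers. All names and parameters here are illustrative, not the paper's.

```python
def prob_range_search(db, pivots, dist, q, r, beta=1.0):
    """Pivot-based range search around query q with radius r.
    beta > 1 stretches the triangle inequality, shrinking the exclusion
    threshold to r/beta (beta = 1 recovers exact search)."""
    dq = [dist(q, p) for p in pivots]
    out = []
    for u in db:
        # discard u if some pivot "proves" (probabilistically) d(q, u) > r
        if any(abs(dq[i] - dist(p, u)) > r / beta
               for i, p in enumerate(pivots)):
            continue
        if dist(q, u) <= r:          # direct check for surviving candidates
            out.append(u)
    return out
```

With beta = 1 the triangle inequality guarantees no true answer is discarded; with beta > 1 the result is a subset of the exact answer, mirroring the paper's small-error-probability trade-off.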
By
Corrochano, Eduardo Bayro
This chapter presents the geometric algebra framework for dealing with 3D kinematics. The reader will see the usefulness of this mathematical approach for applications in computer vision and kinematics. We start with an introduction to 4D geometric algebra for 3D kinematics. Then we reformulate, using 3D and 4D geometric algebras, the classic model for the 3D motion of vectors. Finally, we compare both models, that is, the one using 3D Euclidean geometric algebra and our model, which uses 4D motor algebra.
By
Corrochano, Eduardo Bayro
This chapter is devoted to low-level image processing, applying geometric algebra techniques. We will show that using this mathematical system it is possible to develop powerful algorithms for image filtering, pattern recognition, feature detection, image segmentation, texture analysis, and image analysis in the frequency domain. These techniques are fundamental for automated visual inspection, robot guidance, medical imaging, and analysis of image sequences, as well as for satellite and aerial photogrammetry.
By
Borowsky, E.; Gafni, E.; Lynch, N.; Rajsbaum, S.
53 Citations
We present a shared-memory algorithm that allows a set of f+1 processes to “simulate”, in a wait-free manner, a larger system of n processes that may exhibit up to f stopping failures.
Applying this simulation algorithm to the k-set-agreement problem enables conversion of an arbitrary k-fault-tolerant n-process solution for the k-set-agreement problem into a wait-free (k+1)-process solution for the same problem. Since the (k+1)-process k-set-agreement problem has been shown to have no wait-free solution [5,18,26], this transformation implies that there is no k-fault-tolerant solution to the n-process k-set-agreement problem, for any n.
More generally, the algorithm satisfies the requirements of a fault-tolerant distributed simulation. The distributed simulation implements a notion of fault-tolerant reducibility between decision problems. This paper defines these notions and gives examples of their application to fundamental distributed computing problems.
The algorithm is presented and verified in terms of I/O automata. The presentation has a great deal of interesting modularity, expressed by I/O automaton composition and both forward and backward simulation relations. Composition is used to include a safe agreement module as a subroutine. Forward and backward simulation relations are used to view the algorithm as implementing a multi-try snapshot strategy.
The main algorithm works in snapshot shared memory systems; a simple modification of the algorithm that works in read/write shared memory systems is also presented.
By
Corrochano, Eduardo Bayro
It appears that for biological creatures, the external world may be internalized in terms of intrinsic geometric representations. We can formalize the relationships between the physical signals of external objects and the internal signals of a biological creature by using extrinsic vectors to represent those signals coming from the world and intrinsic vectors to represent those signals originating in the internal world. We can also assume that the external and internal worlds employ different reference coordinate systems. If we consider the acquisition and coding of knowledge to be a distributed and differentiated process, we can imagine that there should exist various domains of knowledge representation that obey different metrics and that can be modeled using different vectorial bases. How nature could have acquired through evolution such tremendous representational power for dealing with such complicated signal processing is an intriguing question [60]. In a stimulating series of articles, Pellionisz and Llinás [77, 78] claim that the formalization of geometrical representation seems to be a dual process involving the expression of extrinsic physical cues built from intrinsic central-nervous-system vectors. These vectorial representations, related to reference frames intrinsic to the creature, are covariant for perception analysis and contravariant for action synthesis. The geometric mapping between these two vectorial spaces can thus be implemented by a neural network which performs as a metric tensor [78].
By
Aguirre, Alfonso San Miguel; Vardi, Moshe Y.
8 Citations
This paper contains an experimental study of the impact of the construction strategy of reduced, ordered binary decision diagrams (ROBDDs) on the average-case computational complexity of random 3-SAT, using the CUDD package. We study the variation of median running times for a large collection of random 3-SAT problems as a function of the density as well as the order (number of variables) of the instances. We used ROBDD-based pure SAT-solving algorithms, which we obtained by an aggressive application of existential quantification, augmented by several heuristic optimizations. Our main finding is that our algorithms display an “easy-hard-less-hard” pattern that is quite similar to that observed earlier for search-based solvers. When we start with low-density instances and then increase the density, we go from a region of polynomial running time to a region of exponential running time, where the exponent first increases and then decreases as a function of the density. The locations of both transitions, from polynomial to exponential and from increasing to decreasing exponent, are algorithm dependent. In particular, the running-time peak is quite independent of the crossover density of 4.26 (where the probability of satisfiability declines precipitously); it occurs at density 3.8 for one algorithm and at density 2.3 for another, demonstrating that the correlation between the crossover density and computational hardness is algorithm dependent.
By
Iglesias, Juan R.; Gupta, Gopal; Pontelli, Enrico; Ranjan, Desh; Milligan, Brook
2 Citations
The goal of this project is to develop solutions to enhance interoperability between bioinformatics applications. Most existing applications adopt different data formats, forcing biologists into tedious translation work. We propose the use of Nexus as an intermediate representation language. We develop a complete grammar for Nexus, and we use this grammar to build a parser. The construction relies heavily on the peculiar features of Prolog to easily derive effective parsing and translating procedures. We also develop a general parse-tree format suitable for interconversion between Nexus and other formats.
By
Corrochano, Eduardo Bayro
Geometric algebra is a coordinate-free approach to geometry based on the algebras of Grassmann [38] and Clifford [20]. The geometric approach to Clifford algebra adopted in this book was pioneered in the 1960s by David Hestenes [51], who has since worked on developing his version of Clifford algebra (referred to as geometric algebra in this volume) into a unifying language for mathematics and physics [47, 4]. Hestenes also presented a study of projective geometry using Clifford algebra [49]. The introductory sections of this chapter will present the basic definitions of geometric algebra. In the text, we denote scalars with lowercase letters and matrices with uppercase letters, and we use bold lowercase for both vectors in three dimensions and the bivector parts of spinors. Spinors and dual quaternions in four dimensions are denoted by bold uppercase letters.
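As a concrete illustration of how such computations can be mechanized (a generic construction, not the book's own code): basis blades of the Euclidean algebra Cl(3,0) can be encoded as bitmasks, with the sign of a product obtained by counting the transpositions needed to sort the generators.

```python
def blade_product(a, b):
    """Product of two basis blades of Cl(3,0), given as bitmasks
    (bit i set = factor e_{i+1}); returns (sign, resulting blade)."""
    sign, rest = 1, a
    for i in range(3):                      # insert b's generators, ascending
        if b & (1 << i):
            # e_{i+1} anticommutes past every higher generator already present
            if bin(rest >> (i + 1)).count("1") % 2:
                sign = -sign
            if rest & (1 << i):             # e_i * e_i = +1 (Euclidean metric)
                rest &= ~(1 << i)
            else:
                rest |= 1 << i
    return sign, rest

def gp(x, y):
    """Geometric product of multivectors given as {blade_bitmask: coeff}."""
    out = {}
    for ba, ca in x.items():
        for bb, cb in y.items():
            s, blade = blade_product(ba, bb)
            out[blade] = out.get(blade, 0) + s * ca * cb
    return {k: v for k, v in out.items() if v != 0}
```

The unit bivector e1e2 squares to -1, which is how geometric algebra recovers the complex and quaternion units inside one system.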
By
Corrochano, Eduardo Bayro
This chapter presents the formulation of robot manipulator kinematics within the geometric algebra framework. In this algebraic system, the 3D Euclidean motion of points, lines, and planes can be advantageously represented using the algebra of motors. The computational complexity of direct and indirect kinematics and of other problems concerning robot manipulators depends on the robot's degrees of freedom as well as on its geometric characteristics. Our approach makes possible a direct algebraic formulation of the concrete problem in such a way that it reflects the underlying geometric structure. This is achieved by describing parts of the problem in terms of motor representations of points, lines, and planes where necessary. The chapter presents the formulation and computation of closed-form solutions of direct and indirect kinematics for standard robot manipulators, and a simple example of a grasping task. The flexible method presented here widens the current standard point- or line-representation-based approaches for the treatment of problems related to robot manipulators.
By
Corrochano, Eduardo Bayro
This chapter presents a mathematical approach based on geometric algebra for the computation of problems in computer vision. We will show that geometric algebra is a well-founded and elegant language for expressing and implementing those aspects of linear algebra and projective geometry that are useful for computer vision. Since geometric algebra offers both geometric insight and algebraic computational power, it is useful for tasks such as the computation of projective invariants, camera calibration, and the recovery of shape and motion. We will mainly focus on the geometry of multiple uncalibrated cameras.
By
Corrochano, Eduardo Bayro
This chapter is dedicated to the estimation of 3D Euclidean transformations using motor algebra. Two illustrations of estimation procedures are given: the first uses a batch approach for the estimation of the unknown 3D transformation between the coordinate reference systems of a robot neck, or arm, and of a digital camera. This problem is called the hand-eye problem, and it is solved using a motion-of-lines model. The second illustration uses a recursive estimation method based on Kalman filter techniques. After introducing the motor extended Kalman filter (MEKF), we estimate 3D rigid motion using the MEKF and 3D lines gained by a visual robot system. These two approaches show that the task of estimating 3D rigid motion is easier using motor algebra, because the motion-of-lines model is linear. This chapter benefits from work done in collaboration with Daniilidis and Zang [8, 12, 26].
By
Korjik, Valeri; Morales-Luna, Guillermo; Balakirsky, Vladimir B.
4 Citations
A secret-key agreement protocol between legal parties based on reconciliation and a privacy amplification procedure has been considered in [2]. The so-called privacy amplification theorem is used to estimate the amount of Shannon information leaking to an illegal party (a passive eavesdropper) about the final key. We consider a particular case where one of the legal parties (Alice) sends to the other legal party (Bob) a random binary string x through a binary symmetric channel (BSC) with bit error probability ε_m, while an eavesdropper (Eve) receives this string through an independent BSC with bit error probability ε_w. We assume that ε_m < ε_w, and hence the main channel is superior to the wiretap channel. To reconcile the strings between the legal parties, Alice sends to Bob, through a noiseless channel, a check string y based on some good error-correcting code. Since this transmission is completely public, Eve can eavesdrop on it, and therefore this extra information has to be taken into account in estimating the information leaking to Eve about the final key. In [3] an inequality was proved to upper-bound Eve's information in such a scenario. The main contribution of the present paper is to improve this inequality and hence to enhance the privacy amplification theorem. We also present bounds on the probability of false reconciliation when the check symbols of the linear code are transmitted through a noiseless channel. The presented results can be very useful when considering the non-asymptotic case.
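The channel setup is straightforward to simulate. A toy sketch (illustrative crossover probabilities; no reconciliation or privacy amplification, just the "main channel superior to wiretap channel" premise):

```python
import random

def bsc(bits, eps, rng):
    """Transmit a bit string through a binary symmetric channel
    with crossover probability eps: each bit flips independently."""
    return [b ^ (rng.random() < eps) for b in bits]
```

Running Alice's string through two independent BSCs with eps_m < eps_w leaves Bob with fewer errors than Eve, which is the advantage the protocol then distills into a secret key.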
By
Osorio, Mauricio; Nieves, Juan Carlos
The stable semantics has become a prime candidate for knowledge representation and reasoning. The rules associated with propositional logic programs and the stable semantics are not expressive enough to let one write concise optimization programs. We propose an extension to the language of logic programs that allows one to express optimization problems in a suitable way. In earlier work we defined the declarative semantics for partial-order clauses. The main contribution of our paper is the following: first, we define the language of our extended paradigm as well as its declarative semantics. Our declarative semantics is based on translating partial-order clauses into normal programs and then using the stable semantics as the intended meaning of the original program. Second, we propose an operational semantics for our paradigm. Our experimental results show that our approach is more efficient than using the well-known system SMODELS on the translated program.
By
Coello Coello, Carlos A.; Pulido, Gregorio
107 Citations
In this paper, we propose a multi-objective optimization approach based on a micro-genetic algorithm (micro-GA), which is a genetic algorithm with a very small population (four individuals were used in our experiments) and a reinitialization process. We use three forms of elitism and a memory to generate the initial population of the micro-GA. Our approach is tested with several standard functions found in the specialized literature. The results obtained are very encouraging, since they show that this simple approach can produce an important portion of the Pareto front at a very low computational cost.
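A single-objective caricature of the micro-GA loop (population of four, elitism across restarts; the paper's Pareto memory and three forms of elitism are omitted, and every name and parameter value here is illustrative):

```python
import random

def micro_ga(fitness, nbits, cycles=40, pop_size=4, inner=25, rng=None):
    """Micro-GA sketch: tiny population, elitism, periodic reinitialization.
    Maximizes `fitness` over bit strings of length nbits."""
    rng = rng or random.Random(0)            # fixed seed for reproducibility
    rand_ind = lambda: [rng.randint(0, 1) for _ in range(nbits)]
    best = rand_ind()
    for _ in range(cycles):
        # restart: keep the elite individual, reinitialize the rest
        pop = [best[:]] + [rand_ind() for _ in range(pop_size - 1)]
        for _ in range(inner):               # short evolutionary burst
            a, b = rng.sample(pop, 2)
            cut = rng.randrange(1, nbits)
            child = a[:cut] + b[cut:]        # one-point crossover
            if rng.random() < 0.1:           # light mutation
                child[rng.randrange(nbits)] ^= 1
            worst = min(range(pop_size), key=lambda i: fitness(pop[i]))
            pop[worst] = child               # replace the worst individual
        cand = max(pop, key=fitness)
        if fitness(cand) > fitness(best):
            best = cand[:]
    return best
```

The restart-with-elitism structure is what keeps such a tiny population from stagnating; the multi-objective version replaces the scalar comparison with Pareto ranking and an external memory.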
By
Korjik, Valeri; Morozov, Kirill
6 Citations
The main cryptographic primitives (Bit Commitment (BC) and Oblivious Transfer (OT) protocols) based on noisy channels have been considered in [1] for the asymptotic case. The non-asymptotic behavior of the BC protocol has been demonstrated in [2]. The current paper provides stricter asymptotic conditions on a Binary Symmetric Channel (BSC) for the OT protocol proposed in [1] to be feasible. We also generalize this protocol using different encoding and decoding methods that require rederiving the formulas for Rényi entropy. The non-asymptotic case (finite length of blocks transmitted between parties) is also presented. Some examples are given to demonstrate that these protocols are in fact reliable and information-theoretically secure. We also discuss how to extend the 1-out-of-2 OT protocol to a 1-out-of-L OT protocol, and how to arrange the BSC connecting the parties. Both BC and OT protocols can be used as components of more complex protocols of greater practical importance, such as “digital cash”, “secure elections”, or “distance bounding”.
By
Bolshakov, Igor A.; Gelbukh, Alexander
8 Citations
The problem of automatic text segmentation is subcategorized into two different problems: thematic segmentation into rather large, topically self-contained sections, and splitting into paragraphs, i.e., lexico-grammatical segmentation at a lower level. In this paper we consider the latter problem. We propose a method for reasonably splitting text into paragraphs based on a text cohesion measure. Specifically, we propose a method for the quantitative evaluation of text cohesion based on a large linguistic resource, a collocation network. At each step, our algorithm compares word occurrences in a text against a large database of collocations and semantic links between words in the given natural language. The procedure consists in evaluating the cohesion function, smoothing it, normalizing it, and comparing it with a specially constructed threshold.
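A toy sketch of the splitting criterion (the collocation network is reduced to a hand-made dictionary, and the smoothing and normalization steps are omitted): cohesion between adjacent sentences is the average collocation strength over cross pairs of words, and a paragraph break is placed wherever it falls below a threshold.

```python
def cohesion(s1, s2, colloc):
    """Cohesion between two word sets: summed collocation strength over
    all cross pairs, normalized by the number of pairs."""
    pairs = [(a, b) for a in s1 for b in s2]
    if not pairs:
        return 0.0
    return sum(colloc.get(frozenset(p), 0.0) for p in pairs) / len(pairs)

def split_paragraphs(sentences, colloc, threshold=0.05):
    """Place a paragraph break wherever inter-sentence cohesion
    drops below `threshold`."""
    paras, cur = [], [sentences[0]]
    for prev, nxt in zip(sentences, sentences[1:]):
        if cohesion(set(prev.split()), set(nxt.split()), colloc) < threshold:
            paras.append(cur)
            cur = []
        cur.append(nxt)
    paras.append(cur)
    return paras
```

All collocation strengths and the threshold value are illustrative; the paper derives them from a large collocation database.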
By
Monroy, Raúl
The analysis of failed proof attempts is central to concept formation. This process essentially makes use of abduction, a form of reasoning that identifies explanations for an observed phenomenon. The main problem with abduction is the combinatorial explosion: the search space originating from a failed proof attempt is often unmanageable. Careful proof-search guidance is thus required to enable a successful analysis of failure. Fortunately, the search space generated by proof planning [6] is moderately small. Using an abduction mechanism built upon proof planning [18], we have successfully patched 40 faulty conjectures about the HOL theory of lists [3]. For each faulty conjecture, our mechanism was able to synthesise a condition that turns the conjecture into a theorem. Each condition proved to be a concept that is known to be useful and interesting. This process is a form of concept formation, and it was carried out automatically. Once refined, the conjectures can then be used to write a fast, uniform proof procedure for proving properties of list constants without effort.
By
Solano-Soto, Jaime; Sucar, Luis Enrique
1 Citation
A novel configuration method for systems design has been developed that considers system reliability and cost at the same time. This method helps to maximize reliability, minimize cost, and obtain the best possible configuration for the system to be designed. To accomplish this, a combination of Bayesian networks and heuristic search is used to help the designer find the optimum configuration in the immense search space available. The method takes as input parameters the minimal reliability requirement or maximum cost of the computer system to be designed, the function of the system as a reliability block diagram, and a description of each component. From this input, the methodology automatically transforms the reliability block diagram into an equivalent Bayesian network, from which the reliability of the system is obtained through probability propagation. Starting from the initial block diagram, a set of heuristic operators is used to generate new configurations. The “best” configurations are obtained using beam search, with some heuristics to improve search efficiency. There are three alternatives for defining the best configurations: (i) minimize cost with a reliability restriction, (ii) maximize reliability with a cost restriction, and (iii) make a compromise between reliability and cost (Pareto set). The methodology is applied to the design of a distributed control system, with promising results.
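The reliability arithmetic and the search loop can be sketched as follows. This is a simplified stand-in, not the paper's method: a series system of stages, per-stage redundancy counts as the only configuration operator, closed-form series/parallel formulas in place of Bayesian-network propagation, and cost as the beam-ranking heuristic. All names and numbers are illustrative.

```python
def r_series(blocks):
    """Series reliability: every block must work."""
    out = 1.0
    for r in blocks:
        out *= r
    return out

def r_parallel(blocks):
    """Parallel (redundant) reliability: at least one block works."""
    out = 1.0
    for r in blocks:
        out *= 1 - r
    return 1 - out

def beam_design(rel, cost, target, width=3, max_copies=4):
    """Beam search over per-stage redundancy counts: start with one copy
    of each component and add copies until system reliability reaches
    `target`, keeping the `width` cheapest partial configurations."""
    n = len(rel)
    system_r = lambda cfg: r_series(
        [r_parallel([rel[i]] * cfg[i]) for i in range(n)])
    total_cost = lambda cfg: sum(cost[i] * cfg[i] for i in range(n))
    beam, best = [(1,) * n], None
    while beam:
        nxt = []
        for cfg in beam:
            for i in range(n):
                if cfg[i] < max_copies:
                    c = cfg[:i] + (cfg[i] + 1,) + cfg[i + 1:]
                    if system_r(c) >= target:
                        if best is None or total_cost(c) < total_cost(best):
                            best = c           # feasible: candidate answer
                    else:
                        nxt.append(c)          # infeasible: keep expanding
        beam = sorted(set(nxt), key=total_cost)[:width]
    return best
```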
By
Akiyama, Jin; Sakai, Toshinori; Urrutia, Jorge
A k-dissection D of a polygon P is a partition of P into a set of subpolygons {Q_1,...,Q_m} with disjoint interiors such that these can be reassembled to form k polygons P_1,...,P_k, all similar to P. If for every j, 1 ≤ j ≤ k, the pieces of D can be assembled into j polygons, all similar to P, then D is called a sequentially k-divisible dissection. In this paper we show that any convex n-gon, n ≤ 5, has a sequentially k-divisible dissection with (k-1)n + 1 pieces. We give sequentially k-divisible dissections for some regular polygons with n ≥ 6 vertices. Furthermore, we show that any simple polygon P with n vertices has a (3k+4)-dissection with (2n-2) + k(2n + ⌊n/3⌋ - 4) pieces, k ≥ 0, that can be reassembled to form 4, 7, ..., or 3k+4 polygons similar to P. We give similar results for star-shaped polygons.
By
Aguirre, Arturo Hernández; Buckles, Bill P.; Coello, Carlos Coello
1 Citation
The number of samples needed to learn an instance of the representation class k-DNF_n^s of Boolean formulas is predicted, using some tolerance parameters, by the PAC framework. When the learning machine is a simple genetic algorithm, the initial population is an issue. Using PAC learning, we derive the population size that has at least one individual at some given Hamming distance from the optimum. Then we show that the population does not need to be close to the optimum in order to learn the concept.
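The flavor of such sample-size predictions can be seen from the standard PAC bound for a finite hypothesis class, m ≥ (ln|H| + ln(1/δ))/ε. This is a textbook bound with a crude counting estimate for |k-DNF|, not the paper's derivation:

```python
import math

def pac_sample_size(ln_h, eps, delta):
    """Standard PAC bound for a consistent learner over a finite class:
    m >= (ln|H| + ln(1/delta)) / eps examples suffice for error <= eps
    with probability >= 1 - delta."""
    return math.ceil((ln_h + math.log(1 / delta)) / eps)

def ln_kdnf_size(n, k):
    """Crude upper bound on ln|k-DNF over n variables|: at most (2n+1)^k
    conjunctions of up to k literals, each included or not, so
    |H| <= 2^((2n+1)^k)."""
    return (2 * n + 1) ** k * math.log(2)
```

The bound grows only logarithmically in |H|, which is why even the doubly exponential size of the k-DNF class yields a polynomial (in n^k) sample requirement.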
By
Mariano, Carlos E.; Morales, Eduardo F.
6 Citations
In reinforcement learning, an autonomous agent learns an optimal policy while interacting with the environment. In particular, in one-step Q-learning, with each action an agent updates its Q-values considering immediate rewards. In this paper a new strategy for updating Q-values is proposed. The strategy, implemented in an algorithm called DQL, uses a set of agents, all searching for the same goal in the same space, to obtain the same optimal policy. Each agent leaves traces over a copy of the environment (copies of Q-values) while searching for a goal. These copies are used by the agents to decide which actions to take. Once all the agents reach a goal, the original Q-values of the best solution found by all the agents are updated using Watkins' Q-learning formula. DQL has some similarities with Gambardella's Ant-Q algorithm [4]; however, it does not require the definition of a domain-dependent heuristic, and consequently avoids the tuning of additional parameters. DQL also does not update the original Q-values with zero reward while the agents are searching, as Ant-Q does. It is shown that DQL's guided exploration by several agents with selective exploitation (updating only the best solution) produces faster convergence times than Q-learning and Ant-Q on several test-bed problems under similar conditions.
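Watkins' one-step update referred to above is the core of both Q-learning and DQL. A minimal tabular sketch (toy corridor environment in the test; DQL's multiple agents and Q-value copies are not modeled):

```python
import random

def q_learning(n_states, actions, step, episodes=500,
               alpha=0.5, gamma=0.9, eps=0.2, rng=None):
    """Tabular one-step Q-learning with epsilon-greedy exploration.
    `step(s, a)` returns (next_state, reward), with next_state None
    at a terminal transition."""
    rng = rng or random.Random(1)            # fixed seed for reproducibility
    Q = {(s, a): 0.0 for s in range(n_states) for a in actions}
    for _ in range(episodes):
        s = 0
        while s is not None:
            a = (rng.choice(actions) if rng.random() < eps
                 else max(actions, key=lambda x: Q[(s, x)]))
            s2, r = step(s, a)
            target = r if s2 is None else r + gamma * max(
                Q[(s2, x)] for x in actions)
            Q[(s, a)] += alpha * (target - Q[(s, a)])   # Watkins' rule
            s = s2
    return Q
```

DQL runs many such agents over shared Q-value copies and applies this same update only to the best trajectory found.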
By
Horebeek, Johan; Tapia-Rodriguez, Ernesto
We study the limiting behaviour of the repeated application of a discrete image operator that belongs to a recently introduced class of morphological operators. For this purpose, we describe the implied iterative process by a discrete dynamical system.
The convergence is derived for binary and grayscale images.
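The iterative process being analyzed can be emulated directly: apply the operator until the image stops changing. A toy sketch, with plain binary dilation standing in for the class of morphological operators studied in the paper:

```python
def dilate(img):
    """Binary dilation with a cross-shaped structuring element."""
    h, w = len(img), len(img[0])
    return [[1 if any(0 <= y + dy < h and 0 <= x + dx < w and img[y + dy][x + dx]
                      for dy, dx in ((0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)))
             else 0
             for x in range(w)] for y in range(h)]

def iterate_to_fixpoint(op, img, max_iter=100):
    """Repeatedly apply an image operator until the image stops changing;
    returns (limit_image, iterations_used)."""
    for i in range(max_iter):
        nxt = op(img)
        if nxt == img:
            return img, i
        img = nxt
    return img, max_iter
```

On a finite binary image an increasing operator like dilation must reach a fixed point, here the all-ones image; the paper's contribution is characterizing such limits for a whole operator class.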
By
Pérez, E. Islas; Coello, C. A. Coello; Aguirre, A. Hernández
3 Citations
In this paper we show a scheme based on case-based reasoning to extract design patterns from a genetic algorithm used to optimize combinational circuits at the gate level. The approach is able to rediscover several of the traditional Boolean rules used for circuit simplification, and it also finds new simplification rules. We also illustrate how the approach can be used to reduce the convergence times of a genetic algorithm by using previously found solutions as cases for solving similar problems.
By
Martinez-Trinidad, José Fco.; Sánchez-Diaz, Guillermo
4 Citations
An algorithm, LC, for the automated construction of conceptual classifications is presented. The algorithm arranges the objects into clusters using a clustering criterion based on graph theory, and constructs the concepts based on the attributes that distinguish objects in the different computed clusters (typical testors). The algorithm may be applied in problems where the objects are described simultaneously by qualitative and quantitative attributes (mixed data), possibly with incomplete descriptions (missing data), and where the number of clusters to be formed is not known a priori. LC has been tested on data from standard public databases.
By
Chandra, Charu; Smirnov, Alexander V.; Sheremetov, Leonid B.
A supply-chain network (SCN), a society formed by autonomous agents to solve a common logistics problem, is a philosophy for enterprise integration. A multi-agent infrastructure for information support of SCN management is proposed. It consists of SCN problem-domain agents and an agent platform. A Member acts as a problem-domain agent, while a Group serves as a Supply Chain Advisor (SCA). The agent platform consists of a Directory Facilitator, an Agent Management System, an Agent Communication Channel, an Internal Platform Message Transport, and Wrapper Agents. Use of this platform ensures interoperability among agents and reusability of components and services.
By
Barrera Cortés, J.; Baruch, I.; Valdez Castro, L.; Vázquez Cervantes, V.
The paper proposes the use of a new Recurrent Neural Network Model (RNNM) to stabilize the fermentation process of Bacillus thuringiensis from fermentation kinetic data. The multi-input multi-output RNNM proposed has ten inputs, six outputs, sixteen neurons in the hidden layer, and both global and local feedbacks. The weight-update learning algorithm designed is a version of the well-known backpropagation-through-time algorithm, adapted to RNNM learning. The approximation error for the last epoch of learning is about 2%, and the total learning time is 201 epochs, where the size of an epoch is 115 iterations.
By
Gelbukh, Alexander; Sidorov, Grigori
15 Citations
We observed that the coefficients of two important empirical statistical laws of language, Zipf's law and Heaps' law, are different for different languages, as we illustrate on English and Russian examples. This may have both theoretical and practical implications. On the one hand, the reasons for this may shed light on the nature of language. On the other hand, these two laws are important in, say, full-text database design, allowing prediction of the index size.
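Heaps' law coefficients, for instance, can be estimated by linear regression in log-log space over vocabulary-growth measurements. A generic estimation sketch, not the authors' procedure:

```python
import math

def heaps_fit(tokens):
    """Fit Heaps' law V = K * n^beta by least squares in log-log space,
    measuring vocabulary size V after every token n; returns (K, beta)."""
    seen, pts = set(), []
    for n, tok in enumerate(tokens, 1):
        seen.add(tok)
        pts.append((math.log(n), math.log(len(seen))))
    xs, ys = zip(*pts)
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    beta = (sum((x - mx) * (y - my) for x, y in pts)
            / sum((x - mx) ** 2 for x in xs))
    return math.exp(my - beta * mx), beta
```

The paper's observation amounts to the fitted (K, beta) (and the analogous Zipf coefficients) differing systematically between English and Russian corpora.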
By
Toscano-Medina, K.; Sanchez-Perez, G.; Nakano-Miyatake, M.; Perez-Meana, H.
Signature recognition is a topic of intensive research due to its great importance in, among other areas, the financial system. However, no sufficiently reliable method yet exists for signature recognition and verification, especially for forgery detection. This paper presents an off-line signature recognition system using features extracted from the off-line signature and an array of five growing cell neural networks. The proposed system was evaluated using 950 signatures of 19 different persons. Experimental results show that the proposed system achieves a fairly good recognition rate with relatively low computational complexity.
By
Montes-y-Gómez, M.; Gelbukh, A.; López-López, A.; Baeza-Yates, R.
20 Citations
Conceptual graphs allow for powerful and computationally affordable representation of the semantic contents of natural language texts. We propose a method of comparison (approximate matching) of conceptual graphs. The method takes into account synonymy and subtype/supertype relationships between the concepts and relations used in the conceptual graphs, thus allowing for greater flexibility of approximate matching. The method also allows the user to choose the desired aspect of similarity in cases when the two graphs can be generalized in different ways. The algorithm and examples of its application are presented. The results are potentially useful in a range of tasks requiring approximate semantic or other structural matching, among them information retrieval and text mining.
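As a toy illustration of approximate matching, far simpler than the method described here (it ignores subtype/supertype generalization), conceptual graphs can be encoded as sets of (concept, relation, concept) triples and compared by overlap, with a hypothetical `synonyms` map folding synonymous concepts together:

```python
def graph_similarity(g1, g2, synonyms=None):
    """Jaccard overlap of two conceptual graphs given as sets of
    (concept, relation, concept) triples.  `synonyms` maps a term to a
    canonical form, so synonymous concepts count as identical."""
    synonyms = synonyms or {}
    canon = lambda triple: tuple(synonyms.get(x, x) for x in triple)
    a = {canon(t) for t in g1}
    b = {canon(t) for t in g2}
    return len(a & b) / len(a | b)
```

With `synonyms={"feline": "cat"}`, the graphs {(cat, on, mat)} and {(feline, on, mat)} match perfectly; without it they do not match at all.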
By
Campoy-Cervera, Pascual; Muñoz-García, David F.; Peña, Daniel; Calderón-Martínez, José A.
2 Citations
This paper presents an implementation of a digital filtering inspection system applied to a paper pulp sheet production process. The automation of the inspection phase is particularly critical in this process and its solution is highly complex. The system is based on neural network learning, allowing a compromise between resolution and processing speed. Experimental results demonstrating the use of this algorithm for the visual detection of defects in images obtained from a real factory environment are presented. These results show that the developed learning method generates filters that fulfil the required inspection standard.
By
Alexandrov, Mikhail; Gelbukh, Alexander; Lozovoi, George
5 Citations
The problem of document categorization is considered. The set of domains and the keywords specific to these domains are assumed to be selected beforehand as initial data. We apply a well-known statistical hypothesis test that treats the images of documents and domains as normalized vectors. In comparison with existing methods, this approach takes into account the random character of the initial data. The classifier is developed within the framework of the Document Investigator software package.
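A minimal sketch of the vector view of categorization, assuming a document is assigned to the domain whose keyword vector is closest in cosine similarity (the statistical hypothesis test of the paper is not reproduced here):

```python
import math
from collections import Counter

def cosine(u, v):
    """Cosine similarity of two sparse vectors held as Counters."""
    dot = sum(w * v.get(t, 0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def classify(doc_tokens, domain_keywords):
    """Assign the document to the domain whose keyword vector it is
    closest to (highest cosine similarity)."""
    doc = Counter(doc_tokens)
    return max(domain_keywords,
               key=lambda d: cosine(doc, Counter(domain_keywords[d])))
```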
By
Terol-Villalobos, Iván R.; Vargas-Vázquez, Damián
5 Citations
In this paper, a class of openings and closings is investigated using the notion of propagation criteria. The main goal in studying these transformations is to eliminate some inconveniences of the morphological opening (closing) and the opening (closing) by reconstruction. The idea behind building these new openings and closings comes from the notions of morphological filters by reconstruction and levelings. Morphological filters by reconstruction extract, from an image, the connected components that are marked. The reconstruction process of the input image is achieved using geodesic dilation (or erosion) until stability. However, when thin connections exist, these filters reconstruct too much and it is sometimes impossible to eliminate some undesirable regions. Because of this inconvenience, propagation criteria must be introduced.
By
Bolshakov, Igor; Gelbukh, Alexander
12 Citations
A computational system manages a very large database of collocations (word combinations) and semantic links. The collocations are related (in the sense of a dependency grammar) word pairs, joined immediately or through prepositions. Synonyms, antonyms, subclasses, superclasses, etc. represent semantic relations and form a thesaurus. The structure of the system is universal, so that its language-dependent parts are easily adjustable to any specific language (English, Spanish, Russian, etc.). Inference rules for prediction of highly probable new collocations automatically enrich the database at runtime. The inference is assisted by the available thesaurus links. The aim of the system is word processing, foreign language learning, parse filtering, and lexical disambiguation.
By
Baruch, Ieroham; Flores, José Martín; Garrido, Ruben; Nenkova, Boyka
The paper proposes to use three Recurrent Trainable Neural Network (RTNN) models for real-time DC motor system identification and state feedback-feedforward control. The proposed RTNN model has a Jordan canonical structure, which permits using the generated vector of states directly for DC motor feedback control. A backpropagation-through-time learning algorithm for RTNN model training is also described. The experimental results confirm the applicability of the described identification and control methodology in practice. The given results of nonlinear mechanical system identification and control by means of three RTNN models show good convergence and confirm the qualities of the RTNN.
By
Galicia-Haro, Sofía N.; Gelbukh, Alexander; Bolshakov, Igor A.
Structural ambiguity is one of the most difficult problems in natural language processing. Two disambiguation mechanisms for unrestricted text analysis are commonly used: lexical knowledge and context considerations. Our parsing method includes three different mechanisms to reveal syntactic structures and an additional voting module to obtain the most probable structures for a sentence. The developed tools do not require any tagging or syntactic marking of texts.
By
Simancas-Acevedo, Eric; Kurematsu, Akira; Nakano-Miyatake, Mariko; Perez-Meana, Hector
Controlling access to secret or personal information by using the speaker's voice transmitted over long-distance communication systems, such as the telephone system, requires accuracy and robustness of the identification or identity verification system, since the speech signal is distorted during the transmission process. Taking these requirements into consideration, a robust text-independent speaker identification system is proposed in which the speaker features are extracted using Linear Prediction Cepstral Coefficients (LPC-CEPSTRAL), and Gaussian Mixture Models, which provide the feature distribution and estimate the optimum model for each speaker, are used for identification. The proposed system was evaluated using a database of 80 different speakers, with a pronounced phrase of 35 s and digits in the Japanese language, recorded over 4 months. Evaluation results show that the proposed system achieves more than 90% recognition rate.
By
Bolshakova, Elena I.
1 Citation
The intensive use of terms of a specific terminology is admittedly one of the most distinguishing features of scientific and technical (sci-tech) texts. We propose to categorize the terms as either dictionary units of generally accepted terminology or new terms introduced into a sci-tech text to facilitate the description of the author's new concepts; we call such terms author's terms. The paper discusses issues concerning author's terms, including frequent ways of their definition and use in sci-tech texts. It is argued that recognition of author's sci-tech terms is important for NLP applications such as computer-aided scientific and literary editing, automatic text abstracting and summarization, or acquisition of expert knowledge. A sketch of a procedure for recognition of author's terms with a certain degree of accuracy is given.
By
Escorial, David
We present a model of an Information Space (IS) that combines the Vector Space Model (VSM) with conceptual hierarchies for Information Retrieval (IR). This model allows navigating between both representations, helping the user in the retrieval process. The goal is to filter the documents the user is interested in, based on their theme.
By
Andina, Diego; Vega-Corona, Antonio
3 Citations
We propose a two-step method for the automatic classification of microcalcifications in mammograms. The first step improves the visualization of any abnormal lesion through feature enhancement based on multiscale wavelet representations of the mammographic images. In the second step, the automatic recognition of microcalcifications is achieved by applying a Neural Network optimized in the Neyman-Pearson sense. That means the Neural Network presents a controlled and very low probability of classifying abnormal images as normal.
By
Chávez, Edgar; Marroquín, José L.; Navarro, Gonzalo
31 Citations
Pivot-based algorithms are effective tools for proximity searching in metric spaces. They allow trading space overhead for the number of distance evaluations performed at query time. With additional search structures (which pose extra space overhead) they can also reduce the amount of side computations. We introduce a new data structure, the Fixed Queries Array (FQA), whose novelties are (1) it permits sublinear extra CPU time without any extra data structure; (2) it permits trading the number of pivots for their precision so as to make better use of the available memory. We show experimentally that the FQA is an efficient tool for searching in metric spaces and that it compares favorably against other state-of-the-art approaches. Its simplicity makes it an effective tool for practitioners seeking a black-box method to plug into their applications.
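The pivot idea the FQA builds on can be sketched as follows: by the triangle inequality, an object x can be discarded without computing d(q, x) whenever |d(q, p) - d(x, p)| > r for some pivot p. This is our illustrative sketch of generic pivot filtering, not the FQA layout itself:

```python
def pivot_search(query, radius, points, pivots, dist):
    """Range query with pivot-based filtering: by the triangle inequality,
    x can be discarded when |d(q, p) - d(x, p)| > r for some pivot p;
    only the surviving candidates are verified with real distances."""
    # Pivot-distance table; in a real index this is precomputed once.
    table = {x: [dist(x, p) for p in pivots] for x in points}
    qd = [dist(query, p) for p in pivots]
    results = []
    for x in points:
        if any(abs(qd[i] - table[x][i]) > radius for i in range(len(pivots))):
            continue                      # pruned: no distance evaluation
        if dist(query, x) <= radius:      # verification step
            results.append(x)
    return results
```

The FQA's contribution lies in how the precomputed pivot distances are laid out and searched (sorted, bit-packed columns), which this sketch does not attempt to reproduce.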
By
Arias-Estrada, Miguel; Xicotencatl, Juan M.
7 Citations
In this paper, an FPGA-based architecture for stereo vision is presented. The architecture provides a high-density disparity map in real time. It is based on area comparison between an image pair using the sum of absolute differences. The architecture scans the input images in partial columns, which are then processed in parallel. The system operates monolithically on a pair of images in real time. An extension to the basic architecture is proposed in order to compute disparity maps on more than two images.
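A software sketch of the underlying SAD block matching (our illustration; the paper's contribution is the parallel FPGA architecture, not this algorithm itself):

```python
def disparity_map(left, right, max_disp=4, win=1):
    """Dense disparity by SAD block matching along scanlines.
    left/right are 2-D lists of intensities from a rectified pair;
    a left pixel (y, x) is matched against right pixels (y, x - d)."""
    h, w = len(left), len(left[0])
    disp = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            best_sad, best_d = None, 0
            for d in range(min(max_disp, x) + 1):
                sad = 0
                # sum of absolute differences over a (2*win+1)^2 window
                for dy in range(-win, win + 1):
                    for dx in range(-win, win + 1):
                        yy, xl, xr = y + dy, x + dx, x + dx - d
                        if 0 <= yy < h and 0 <= xl < w and 0 <= xr < w:
                            sad += abs(left[yy][xl] - right[yy][xr])
                if best_sad is None or sad < best_sad:
                    best_sad, best_d = sad, d
            disp[y][x] = best_d
    return disp
```

The FPGA version evaluates the candidate disparities of the inner loop in parallel hardware, which is what makes real-time operation possible.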
By
Aguilera, A.; Ayala, D.
2 Citations
In recently published papers we presented the Extreme Vertices Model (EVM), a concise and complete model for representing orthogonal polyhedra and pseudo-polyhedra (OPP). This model exploits the simplicity of its domain by allowing robust and simple algorithms for set-membership classification and Boolean operations that do not need to perform floating-point operations.
Several applications of this model have also been published, including the suitability of OPP as geometric bounds in Constructive Solid Geometry (CSG).
In this paper, we present an algorithm that converts this model into a B-Rep model. We also develop the application of the Alternating Sum of Volumes decomposition to this particular type of polyhedra by taking advantage of the simplicity of the EVM. Finally, we outline our future work, which deals with the suitability of the EVM in the field of digital image processing.
By
Montes-y-Gómez, M.; Gelbukh, A.; López-López, A.
News reports are an important source of information about society. Their analysis allows understanding its current interests and measuring the social importance and influence of different events. In this paper, we use the analysis of news as a means to explore the interests of society. We focus on the study of a very common phenomenon in news: the influence of the peak news topics on other current news topics. We propose a simple statistical text mining method to analyze such influences. We differentiate between the observable associations (those discovered from the newspapers) and the real-world associations, and propose a technique by which the real ones can be inferred from the observable ones. We illustrate the method with some results obtained from preliminary experiments and argue that the discovery of ephemeral associations can be translated into knowledge about the interests of society and social behavior.
By
Olague, Gustavo
5 Citations
This work describes the use of genetic algorithms for automating the photogrammetric network design process. When planning a photogrammetric network, the cameras should be placed so as to satisfy a set of interrelated and competing constraints. Furthermore, when the object is three-dimensional, a combinatorial problem occurs. Genetic algorithms are stochastic optimization techniques which have proved useful for solving computationally difficult problems with high combinatorial aspects. EPOCA (an acronym for “Evolving POsitions of CAmeras”) has been developed using a three-dimensional CAD interface. EPOCA is a genetic-based system that provides the attitude of each camera in the network, taking into account the imaging geometry as well as several major constraints like visibility, convergence angle, and workspace constraints. EPOCA reproduces configurations reported in the photogrammetric literature. Moreover, the system can design networks for several adjoining planes and complex objects, opening interesting new research avenues.
By
Bolshakov, Igor A.
Personal pronouns of several European languages are demarcated and parametrized by means of their morphological, syntactic, and semantic features. A universal semantic representation of personal pronouns as vectors of a few semantic components is proposed. The main semantic components are implied by syntactic features: grammatical person, number, and gender. The results of the parameterization are applied to morpho-syntactic and semantic agreement, translation of pronouns from one language to another, and disambiguation of several verbal constructions.
By
Velasco, Fernando A.; Marroquín, José L.
8 Citations
Snakes are active contours that minimize an energy function. We present a new kind of active contour called “Sandwich Snakes”. They are formed by two snakes, one inside and the other outside of the curve one is looking for. They have the same number of particles, which are connected in one-to-one correspondence. At the minimum, the two snakes have the same position. We also present a multiscale system, in which Sandwich Snakes are adjusted at increasing resolutions, and an interactive tool that permits one to easily specify the initial position of the Sandwich Snakes. Sandwich Snakes exhibit very good performance detecting contours with complex shapes, where traditional methods fail. They are also very robust with respect to noise.
By
Ramírez, J. Federico; Fuentes, Olac; Gulati, Ravi K.
3 Citations
In this article we present a method for the automated prediction of stellar atmospheric parameters from spectral indices. This method uses a genetic algorithm (GA) for the selection of relevant spectral indices and prototypical stars, and predicts the parameters using the k-nearest neighbors method (KNN). We have applied the method to predict the effective temperature, surface gravity, metallicity, luminosity class and spectral class of stars from spectral indices. Our experimental results show that the feature selection performed by the genetic algorithm reduces the running time of KNN by up to 92% and the prediction error by up to 35%.
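The combination can be sketched in a few dozen lines: a bit mask selects features, leave-one-out 1-NN accuracy serves as the GA fitness, and truncation selection with one-point crossover and bit-flip mutation evolves the mask. All parameter values here are illustrative, not the paper's settings:

```python
import random

def knn_accuracy(X, y, mask):
    """Leave-one-out 1-NN accuracy using only the features where mask[k] == 1."""
    idx = [k for k, m in enumerate(mask) if m]
    if not idx:
        return 0.0
    hits = 0
    for i in range(len(X)):
        best_d, best_j = None, None
        for j in range(len(X)):
            if i == j:
                continue
            d = sum((X[i][k] - X[j][k]) ** 2 for k in idx)
            if best_d is None or d < best_d:
                best_d, best_j = d, j
        hits += (y[best_j] == y[i])
    return hits / len(X)

def ga_select(X, y, pop=12, gens=15, seed=0):
    """Evolve a 0/1 feature mask maximizing leave-one-out 1-NN accuracy."""
    rng = random.Random(seed)
    nfeat = len(X[0])
    P = [[rng.randint(0, 1) for _ in range(nfeat)] for _ in range(pop)]
    for _ in range(gens):
        P.sort(key=lambda m: knn_accuracy(X, y, m), reverse=True)
        elite = P[:pop // 2]               # truncation selection
        children = []
        while len(elite) + len(children) < pop:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, nfeat)  # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:         # bit-flip mutation
                child[rng.randrange(nfeat)] ^= 1
            children.append(child)
        P = elite + children
    return max(P, key=lambda m: knn_accuracy(X, y, m))
```

On data where one feature separates the classes and another is noise, the GA quickly converges on a mask that keeps only the informative feature, which is the mechanism behind both speedups reported in the abstract.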
By
Duursma, I. M.; Rentería, C.; Tapia-Recillas, H.
36 Citations
By using results and techniques from commutative algebra, such as the vanishing ideal of a set of points, its a-invariant, the Hilbert polynomial and series, as well as finite free resolutions and the canonical module, some results about Reed-Muller codes defined on a zero-dimensional complete intersection in n-dimensional projective space are given. Several examples of this class of codes are presented in order to illustrate the ideas.
By
Molina, Arturo; Flores, Myrna
In the Framework for Global Virtual Business proposed by the COSME network, a Virtual Enterprise Broker and Virtual Industry Clusters form a Virtual Organisation. The main objective of a Virtual Organisation is to create Virtual Enterprises to exploit business opportunities identified by the Virtual Enterprise Broker, using core competencies deployed from partners in different Virtual Industry Clusters. Virtual Enterprise Brokers (VEBs) will need to do much more than just identify market needs. The VEB is in charge of identifying, selecting and qualifying the best partners for the Virtual Enterprise, and is also responsible for managing the partners' competencies. The objective of this paper is to propose a business model for the VEB to exploit global business opportunities, based on four main processes: ideation, basic development, advanced development and launching. These four main processes describe all the activities and tasks that should be performed by VEBs in order to fulfil market requirements successfully through the Virtual Enterprise concept. Finally, a case study is presented to demonstrate how the VEB, using Virtual Industry Clusters, shares and deploys companies' competencies in order to satisfy market demands.
By
Caballero, Daniel; Molina, Arturo; Bauernhansl, Thomas
Based on the Framework for Global Virtual Business developed by the COSME Network, the partners for a Virtual Enterprise (VE) have to be selected from members of a Virtual Industry Cluster (VIC). This paper describes 1) the major issues involved in evaluating possible members of the VIC, i.e. core competencies (products, processes, and technologies) and infrastructures (informational, social, legal and physical), and 2) a methodology to select members for a VIC based on quantitative and qualitative analysis. This methodology integrates a set of benchmarking tools to evaluate enterprises' competencies and infrastructures. A case study was undertaken to create a VIC of the metal-mechanical and plastics industry in Monterrey, Mexico. A WWW page was created to describe members' key information on competencies and infrastructures.
By
Böttcher, A.; Karlovich, Yu. I.
3 Citations
The Cauchy singular integral operator, S, is one of the main actors in the theory of Fourier convolutions, Toeplitz operators, Riemann-Hilbert problems, Wiener-Hopf and singular integral equations. While the boundedness of S has been studied for many decades, final results on the spectrum of S were obtained only in recent times. During the last few years, it was discovered that there is a surprising and undreamt-of metamorphosis of the (local) spectra of S from circular arcs via horns and logarithmic double-spirals to so-called logarithmic leaves with a halo. This article is a survey of this fascinating development. We hope it is both a feast for the eyes and an illustration of Nick Trefethen's motto: The spectrum gives an operator a personality!
By
Medina, Abraham; Treviño, César; Hernando, Pedro J.; Bonilla, Luis L.
Particle image velocimetry (PIV) has been used to experimentally study the velocity field during the discharge of a monolayer of monodisperse granular material (composed of 3 mm-diameter glass beads) in silos with different geometries. The silos were rectangular, with the exit port located at the center or close to the lateral wall, plus a convergent silo with angle φ_w = 60°. In all cases we characterized the mean velocity profiles and the fluctuation intensities. For silos with the exit port located at the center (symmetrical configurations), strong flow oscillations were detected, but not for the non-symmetrical configuration. This last result is a strong indication that the mass flow rate is lower in the symmetrical silos. A simple theory is given for stationary flow in convergent silos and its corresponding parameters were calculated from experimental data. Numerical simulations show good agreement with the measured velocity fields.
By
Narvaez-Berthelemot, Nora; Russell, Jane M.
10 Citations
An analysis carried out on the 4,326 periodicals in the social sciences included in the most recent (1991) printed edition of the UNESCO DARE database showed that 64% of the world's production is published by High Income Economy countries (IEC). Only 0.7% of Low IEC journals in the UNESCO database were also present in the Social Sciences Citation Index (SSCI) for the same year, while the corresponding figures for the Middle and High IEC were 2.3% and 97.0%, respectively. With the notable exception of the United States, all countries had fewer journals in the SSCI than in the UNESCO database.
By
Hernández-Lerma, Onésimo; Govindan, T. E.
14 Citations
This paper concerns non-stationary continuous-time Markov control processes on Polish spaces, with the infinite-horizon discounted cost criterion. Necessary and sufficient conditions are given for a control policy to be optimal and asymptotically optimal. In addition, under suitable hypotheses, it is shown that the successive approximation procedure converges in the sense that the sequence of finite-horizon optimal cost functions and the corresponding optimal control policies both converge.
By
Rabinovich, Vladimir S.; Roch, Steffen; Silbermann, Bernd
7 Citations
We develop the stability theory for the finite section method for general band-dominated operators on l^{p} spaces over Z^{k}. The main result says that this method is stable if and only if each member of a whole family of operators – the so-called limit operators of the method – is invertible and the norms of these inverses are uniformly bounded.
By
Stephens, C. R.; Mora Vargas, J.
4 Citations
In paper I of this series we reviewed the recent development of an alternative paradigm for evolution on a fitness landscape, effective fitness, which offers an intuitive way to understand population dynamics as flows on an effective fitness landscape when genetic operators other than reproductive selection play an important role. In this article we demonstrate the utility of the concept using several simple analytical models and some more complex models that we simulate numerically. In particular, we show that effective fitness offers a qualitative and quantitative framework within which the phenomenon of induced symmetry breaking of the genotype-phenotype map may be understood. As explicit examples we consider: the violation of the building block hypothesis in non-epistatic landscapes; self-adaptation of genetic algorithms in time-dependent fitness landscapes; and the appearance of evolutionary robustness as an emergent property in the evolution of language. In all cases we demonstrate that effective fitness offers a framework within which these diverse phenomena can be understood and, in principle, quantitatively studied.
By
Böttcher, A.; Karlovich, Yu. I.; Spitkovsky, I.
3 Citations
We establish a Fredholm criterion and an index formula for Toeplitz operators with semi-almost periodic matrix symbols on the Hardy spaces H^{p} (1<p<∞). Our main result completes the Fredholm theory of the aforementioned operators and generalizes previous results, which concerned the case p=2 or were based on certain additional assumptions, such as factorizability, for the almost periodic representatives of the symbol.
By
Bagnoli, Franco; Rechtman, Raul
In these notes we discuss the concept of Lyapunov exponents of cellular automata (CA). We also present a synchronization mechanism for CA. We begin with an introduction to CA, introduce the concept of the Boolean derivative, and show that any CA has a finite expansion in terms of the Boolean derivatives. The Lyapunov exponents are defined as the rate of exponential growth of the linear part of this expansion using a suitable norm. We then present a simple mechanism for the synchronization of CA and apply it to totalistic one-dimensional CA. The CA with a non-zero synchronization threshold exhibit complex non-periodic space-time patterns and vice versa. This synchronization transition is related to directed percolation. The synchronization threshold is strongly correlated with the maximum Lyapunov exponent, and we propose approximate relations between these quantities.
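A minimal numerical sketch of the idea (our illustration, not the notes' Boolean-derivative formalism): perturb one cell of an elementary CA and measure the exponential growth rate of the Hamming distance between the two trajectories, a standard damage-spreading proxy for the maximal Lyapunov exponent:

```python
import math
import random

def step(state, rule):
    """One synchronous update of an elementary CA (Wolfram rule number),
    with periodic boundary conditions."""
    n = len(state)
    return [(rule >> (4 * state[(i - 1) % n] + 2 * state[i] + state[(i + 1) % n])) & 1
            for i in range(n)]

def damage_rate(rule, n=64, steps=32, seed=1):
    """Evolve a random configuration and a one-bit perturbation of it,
    and return the growth rate of their Hamming distance."""
    rng = random.Random(seed)
    a = [rng.randint(0, 1) for _ in range(n)]
    b = list(a)
    b[n // 2] ^= 1                       # single-site perturbation
    for _ in range(steps):
        a, b = step(a, rule), step(b, rule)
    d = sum(x != y for x, y in zip(a, b))
    return math.log(max(d, 1)) / steps   # initial Hamming distance is 1
```

For the identity rule 204 the damage neither grows nor dies, giving rate 0, while for a chaotic rule such as rule 30 the damage cloud spreads and the rate is positive.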
