Showing 1 to 100 of 191 matching Articles
By
Gafni, Eli; Rajsbaum, Sergio
11 Citations
In round-by-round models of distributed computing, processes run in a sequence of (synchronous or asynchronous) rounds. The advantage of the round-by-round approach is that invariants established in the first round are preserved in later rounds. An elegant asynchronous round-by-round shared memory model is the iterated snapshots (IS) model. In the snapshots model, processes share an array m[·], indexed by process ID, that can be accessed any number of times: P_{i} writes to m[i] and can take a snapshot of the entire array. In the IS model, processes instead share a two-dimensional array m[·,·], indexed by iteration number and by process ID: in iteration r, P_{i} writes once to m[r, i] and takes one snapshot of row r, m[r,·]. The IS model lends itself more easily to combinatorial analysis. However, to show that whenever a task is impossible in the IS model it is also impossible in the snapshots model, a simulation is needed. Such a simulation was presented by Borowsky and Gafni in PODC 1997; namely, it was shown how to take a wait-free protocol for the snapshots model and transform it into a protocol for the IS model solving the same task.
In this paper we present a new simulation from the snapshots model to the IS model, and show that it can be extended to work with models stronger than wait-free. The main contribution is to show that the simulation can work with models that have access to certain communication objects, called 01-tasks. This extends the result of Gafni, Rajsbaum and Herlihy in DISC 2006, stating that renaming is strictly weaker than set agreement, from the IS model to the usual non-iterated wait-free read/write shared memory model.
We also show that our simulation works with t-resilient models and the more general dependent process failure model of Junqueira and Marzullo. This version of the simulation extends previous results by Herlihy and Rajsbaum in PODC 2010 and DISC 2010, about the topological connectivity of a protocol complex in an iterated dependent process failure model, to the corresponding non-iterated model.
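As an illustration (a toy sketch of the model, not the paper's simulation), the one-write-one-snapshot discipline of the IS model can be rendered as a shared object; all class and function names here are ours:

```python
import threading

class IteratedSnapshots:
    """Toy shared memory for the iterated snapshots (IS) model: in
    iteration r, process i writes once to m[r][i] and then takes one
    atomic snapshot of row r."""
    def __init__(self, n_procs, n_rounds):
        self.m = [[None] * n_procs for _ in range(n_rounds)]
        self.lock = threading.Lock()  # makes each write/snapshot atomic in this toy

    def write(self, r, i, value):
        with self.lock:
            self.m[r][i] = value

    def snapshot(self, r):
        with self.lock:
            return list(self.m[r])

def run_process(mem, i, n_rounds, initial):
    """A process carries its view from each iteration into the next."""
    view = initial
    for r in range(n_rounds):
        mem.write(r, i, view)    # the single write of iteration r
        view = mem.snapshot(r)   # the single snapshot of row r
    return view
```

Note how a slow process never reads a row beyond its current iteration, which is the structural property that makes the runs of the IS model recursive.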
By
Bagnoli, Franco; El Yacoubi, Samira; Rechtman, Raúl
6 Citations
An important question to be addressed regarding system control on a time interval [0, T] is whether some particular target state in the configuration space is reachable from a given initial state. When the target of interest refers only to a portion of the spatial domain, we speak of regional analysis. The cellular automata approach has recently been promoted for the study of control problems on spatially extended systems for which the classical approaches cannot be used. An interesting problem concerns the situation where the subregion of interest is not interior to the domain but a portion of its boundary. In this paper we address the problem of regional controllability of cellular automata via boundary actions, i.e., we investigate the characteristics of a cellular automaton such that it can be controlled inside a given region by acting only on the value of sites at its boundaries.
By
Czyzowicz, Jurek; Kranakis, Evangelos; Krizanc, Danny; Lambadaris, Ioannis; Narayanan, Lata; Opatrny, Jaroslav; Stacho, Ladislav; Urrutia, Jorge; Yazdani, Mohammadreza
30 Citations
A set of sensors establishes barrier coverage of a given line segment if every point of the segment is within the sensing range of some sensor. Given a line segment I, n mobile sensors in arbitrary initial positions on the line (not necessarily inside I), and the sensing ranges of the sensors, we are interested in finding final positions of the sensors which establish a barrier coverage of I such that the sum of the distances traveled by all sensors from initial to final positions is minimized. It is shown that when the sensors may have different sensing ranges, the problem is NP-complete, even to approximate up to a constant factor. When the sensors have identical sensing ranges, we give several efficient algorithms to calculate the final destinations so that the sensors either establish a barrier coverage or maximize the coverage of the segment if complete coverage is not feasible, while at the same time the sum of the distances traveled by all sensors is minimized. Some open problems are also mentioned.
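The coverage condition itself is simple to verify; the following helper (ours, for illustration only, not one of the paper's algorithms) checks whether a given set of final positions establishes barrier coverage of a segment:

```python
def covers(segment, positions, ranges):
    """Return True if the sensors establish barrier coverage of `segment`.
    segment: (a, b); sensor i covers [positions[i] - ranges[i],
    positions[i] + ranges[i]].  Standard interval-sweep check."""
    a, b = segment
    intervals = sorted((p - r, p + r) for p, r in zip(positions, ranges))
    reach = a
    for lo, hi in intervals:
        if lo > reach:                 # gap: points in (reach, lo) uncovered
            return False
        reach = max(reach, hi)
        if reach >= b:
            return True
    return reach >= b
```

For example, sensors at 1 and 3 with range 1 cover the segment [0, 4], but shrinking the first range to 0.5 leaves a gap.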
By
Rajsbaum, Sergio
5 Citations
In centralized computing we can compute a function by composing a sequence of elementary functions, where the output of the i-th function in the sequence is the input to the (i+1)-st function. This computation is done without persistent registers that could store the outcomes of these function invocations. In distributed computing, a task is the analogue of a function. An iterated model is defined by some base set of tasks. Processes invoke a sequence of tasks from this set; each process invokes the (i+1)-st task with its output from the i-th task. Processes access the sequence of tasks, one by one, in the same order, and asynchronously. Any number of processes can crash. In the most basic iterated model the base tasks are read/write registers. Previous papers have studied this and other iterated models, with more powerful base tasks or enriched with failure detectors, which have been useful for proving impossibility results and designing algorithms, due to the elegant recursive structure of the runs. This talk surveys results in this area, contributed mainly by Borowsky, Gafni, Herlihy, Raynal, Travers and the author.
By
Gafni, Eli; Rajsbaum, Sergio
15 Citations
The benefits of developing algorithms via recursion are well known. However, recursion has seen little use in distributed algorithms, in spite of the fact that recursive structuring principles for distributed systems have been advocated since the beginning of the field. We present several distributed algorithms in a recursive form, which makes them easier to understand and analyze. We also expose several interesting issues arising in recursive distributed algorithms. Our goal is to promote the use and study of recursion in distributed algorithms.
By
Castro-Manzano, José Martín
Intentional reasoning is a form of logical reasoning with a temporal-intentional and defeasible nature. By covering conditions of material and formal adequacy, we describe a defeasible logic of intention to support the thesis that intentional reasoning is bona fide reasoning.
By
Castañeda, Armando; Herlihy, Maurice; Rajsbaum, Sergio
4 Citations
In the renaming problem, each process in a distributed system is issued a unique name from a large name space, and the processes must coordinate with one another to choose unique names from a much smaller name space.
We show that lower bounds on the solvability of renaming in an asynchronous distributed system can be formulated as a purely topological question about the existence of an equivariant chain map from a “topological disk” to a “topological annulus”. Proving the nonexistence of such a map implies the nonexistence of a distributed renaming algorithm in several related models of computation.
By
Vega, Gerardo
1 Citation
Reducible cyclic codes with exactly two nonzero weights were first studied by T. Helleseth [4] and J. Wolfmann [15]. Later on, G. Vega [11] set forth sufficient numerical conditions for a cyclic code, constructed as the direct sum of two one-weight cyclic codes, to have exactly two nonzero weights, and conjectured that there are no other reducible two-weight cyclic codes of this type. In this paper we present a new class of cyclic codes constructed as the direct sum of two one-weight cyclic codes. As will be shown, this new class of cyclic codes is in accordance with the previous conjecture, since its codes have exactly six nonzero weights. In fact, for these codes we also give their full weight distribution, showing that none of them can be projective. On the other hand, a family of six-weight reducible cyclic codes and their weight distribution was recently presented by Y. Liu, et al. [7]; it is worth noting, however, that unlike what is done here, the codes in that family are constructed as the direct sum of three irreducible cyclic codes.
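For intuition about weight distributions, on small examples the weights of a cyclic code can simply be enumerated; this illustrative sketch (ours, over GF(2) rather than the larger fields of the paper) computes the weight distribution of a binary cyclic code from its generator polynomial:

```python
from itertools import product

def cyclic_code_weights(n, g):
    """Weight distribution of the binary cyclic code of length n with
    generator polynomial g (GF(2) coefficients, lowest degree first).
    Every codeword is msg(x) * g(x) with deg(msg) < k = n - deg(g);
    brute-force enumeration, so only for tiny codes."""
    k = n - (len(g) - 1)
    counts = {}
    for msg in product([0, 1], repeat=k):
        word = [0] * n
        for i, mi in enumerate(msg):          # polynomial product mod 2
            if mi:
                for j, gj in enumerate(g):
                    word[i + j] ^= gj
        w = sum(word)                         # Hamming weight
        counts[w] = counts.get(w, 0) + 1
    return counts
```

For instance, g(x) = 1 + x + x^3 with n = 7 generates the [7,4] Hamming code, whose weight enumerator is 1 + 7z^3 + 7z^4 + z^7.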
By
Bagnoli, Franco; Rechtman, Raúl
1 Citation
We study the regional master-slave synchronization of a one-dimensional probabilistic cellular automaton with two absorbing states. The master acts on the boundary of an interval, the region, of a fixed size. For some values of the parameters, this is enough to achieve synchronization in the region. For other values, we extend the regional synchronization to include a fraction of sites inside the region of interest. We present four different ways of doing this and show which is the most effective one, in terms of the fraction of sites inside the region and the time needed for synchronization.
By
Carrillo-Pacheco, Jesús; Vega, Gerardo; Zaldívar, Felipe
Using Plücker coordinates we construct a matrix whose columns parametrize all projective isotropic lines in a symplectic space E of dimension 4 over a finite field $\mathbb{F}_{q}$. As an application of this construction we explicitly obtain the smallest subfamily of algebraic-geometric codes defined by the corresponding Lagrangian-Grassmannian variety. Furthermore, we show that this subfamily is a class of three-weight linear codes over $\mathbb{F}_{q}$ of length (q^{4} − 1)/(q − 1), dimension 5, and minimum Hamming distance q^{3} − q.
By
Goltsev, Alexander; Gritsenko, Vladimir; Kussul, Ernst; Baidyk, Tatiana
1 Citation
We propose an algorithm for finding a set of texture features characterizing the most homogeneous texture area of an input image. The found set of features is intended for the extraction of this segment. The algorithm processes any input image in the absence of any preliminary information about it and, accordingly, without any learning. The essence of the algorithm is as follows. The image is covered with a number of test windows, and in each of them a degree of texture homogeneity is measured. The test window with the maximal degree of homogeneity is determined, and a representative patch of pixels is detected in it. The texture features extracted from the detected representative patch are considered as those that best characterize the most homogeneous texture segment. Thus, the proposed algorithm facilitates the texture segmentation task by providing a segmentation technique with helpful additional information about the analyzed image. A computer program simulating the algorithm has been created and tested on natural grayscale images.
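The window-scanning step can be sketched directly; here pixel variance stands in for the paper's homogeneity degree (our simplification, lower variance meaning more homogeneous):

```python
def most_homogeneous_window(image, win):
    """Scan `image` (2-D list of grayscale values) with win x win test
    windows; return the top-left corner and variance of the window with
    the smallest pixel variance, i.e. the most homogeneous one."""
    h, w = len(image), len(image[0])
    best, best_var = None, float('inf')
    for y in range(h - win + 1):
        for x in range(w - win + 1):
            pix = [image[y + dy][x + dx]
                   for dy in range(win) for dx in range(win)]
            mean = sum(pix) / len(pix)
            var = sum((p - mean) ** 2 for p in pix) / len(pix)
            if var < best_var:
                best, best_var = (y, x), var
    return best, best_var
```

The representative patch of pixels would then be drawn from the winning window.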
By
Barrón-Cedeño, Alberto; Sierra, Gerardo; Drouin, Patrick; Ananiadou, Sophia
13 Citations
The C-value/NC-value algorithm, a hybrid approach to automatic term recognition, was originally developed to extract multi-word term candidates from specialised documents written in English. Here, we present three main modifications to this algorithm that affect how the obtained output is refined. The first modification aims to maximise the number of real terms in the list of candidates with a new approach for the stoplist application process. The second modification adapts the C-value calculation formula in order to consider single-word terms. The third modification changes how the term candidates are grouped, exploiting a lemmatised version of the input corpus. Additionally, the size of the candidate's context window is made variable. We also show the necessary linguistic modifications to apply this algorithm to the recognition of term candidates in Spanish.
By
Cortés-Berrueco, Luis Enrique; Gershenson, Carlos; Stephens, Christopher R.
In this paper three computational models for the study of the evolution of cooperation under cultural propagation are studied: Kin Selection, Direct Reciprocity and Indirect Reciprocity. Two analyses are reported: one comparing the behavior of the models with one another, and a second one identifying the impact that different parameters have on the model dynamics. The results of these analyses illustrate how game transitions may occur depending on some parameters within the models, and also explain how agents adapt to these transitions by individually choosing their attachment to a cooperative attitude. These parameters regulate how cooperation can self-organize under different circumstances. The emergence of the evolution of cooperation as a result of the agents' adaptation processes is also discussed.
By
Chávez, Edgar; Dobrev, Stefan; Kranakis, Evangelos; Opatrny, Jaroslav; Stacho, Ladislav; Urrutia, Jorge
7 Citations
We give an algorithm for constructing a connected spanning subgraph (spanner) of a wireless network modelled as a unit disk graph with nodes of irregular transmission ranges, whereby for some parameter 0 < r ≤ 1 the transmission range of a node includes the entire disk around the node of radius at least r and does not include any node at distance more than one. The construction of the spanner is distributed and local in the sense that nodes use only information in their vicinity; moreover, for a given integer k ≥ 2, each node needs only consider the nodes at distance at most k hops from it. The resulting spanner has maximum degree at most $3 + \frac{6}{\pi r} + \frac{r+1}{r^{2}}$ when 0 < r < 1 (and at most five when r = 1). Furthermore, it is shown that the spanner is planar provided that the distance between any two nodes is at least $\sqrt{1-r^{2}}$. If the spanner is planar, then for k ≥ 2 the sum of the Euclidean lengths of the edges of the spanner is at most $\frac{kr+1}{kr-1}$ times the sum of the Euclidean lengths of the edges of a minimum weight Euclidean spanning tree.
By
Hernández, Noé; Eder, Kerstin; Magid, Evgeni; Savage, Jesús; Rosenblueth, David A.
The formal verification of properties of Hidden Markov Models (HMMs) is highly desirable for gaining confidence in the correctness of the model and the corresponding system. A significant step towards HMM verification was the development by Zhang et al. of a family of logics for verifying HMMs, called POCTL*, and its model checking algorithm. As far as we know, the verification tool we present here is the first one based on Zhang et al.’s approach. As an example of its effective application, we verify properties of a handover task in the context of humanrobot interaction. Our tool was implemented in Haskell, and the experimental evaluation was performed using the humanoid robot Bert2.
By
González-Castro, Victor; MacKinnon, Lachlan M.; Pilar Angeles, María
In recent years there has been a growing interest in the research community in the utilisation of alternative data models that abandon the relational record storage and manipulation structure. The authors have already reported experimental considerations of the behaviour of n-ary Relational, Binary-Relational, Associative and Transrelational models within the context of Data Warehousing [1], [2], [3], to address issues of storage efficiency and combinatorial explosion through data repetition. In this paper we present the results obtained during the industrial usage of a Binary-Relational-model-based DBMS within a reference architectural configuration. These industrial results are similar to the ones obtained during the experimental stage of this research in the University laboratory [4], where improvements in query speed and data loading, and considerable reductions in disk space, were achieved. These industrial tests covered a wide set of industries: Manufacturing, Government, Retail, Telecommunications and Finance.
By
Sakai, T.; Nara, C.; Urrutia, J.
In this paper, we consider the problem of packing two or more equal-area polygons with disjoint interiors into a convex body K in E^{2} such that each of them has at most a given number of sides. We show that for a convex quadrilateral K of area 1, there exist n internally disjoint triangles of equal area such that the sum of their areas is at least $\frac{4n}{4n+1}$. We also prove results for other types of convex polygons K. Furthermore, we show that in any centrally symmetric convex body K of area 1 we can place two internally disjoint n-gons of equal area such that the sum of their areas is at least $\frac{n-1}{\pi}\sin\frac{\pi}{n-1}$. We conjecture that this result is true for any convex body.
By
Rosenblueth, David A.
Tamaki and Sato were perhaps among the first to study folding in logic programs. These authors considered the folding of a single clause (E) using a single-clause definition (D) to give a folded clause (F):

$$\frac{B \leftarrow Q \;\; (D) \qquad A \leftarrow Q\Theta, R \;\; (E)}{A \leftarrow B\Theta, R \;\; (F)}\ \text{fold}$$

plus some syntactic conditions, later refined by Gardner and Shepherdson. Here and throughout, A and B denote atoms, and Q and R denote tuples of literals.
By
Gafni, Eli; Rajsbaum, Sergio; Herlihy, Maurice
27 Citations
We consider the relative power of two important synchronization problems: set agreement and renaming. We show that renaming is strictly weaker than set agreement in a round-by-round model of computation. We introduce new techniques, including previously unknown connections between properties of manifolds and computation, as well as novel "symmetry-breaking" constructions.
By
Aichholzer, Oswin; Cetina, Mario; Fabila-Monroy, Ruy; Leaños, Jesús; Salazar, Gelasio; Urrutia, Jorge
Let P be a simple polygon in the plane. Two vertices of P are visible if the open line segment joining them is contained in the interior of P. In this paper we study the following questions posed in [8,9]: (1) Is it true that every non-convex simple polygon has a vertex that can be continuously moved such that during the process no vertex-vertex visibility is lost and some vertex-vertex visibility is gained? (2) Can every simple polygon be convexified by continuously moving only one vertex at a time, without losing any internal vertex-vertex visibility during the process?
We provide a counterexample to (1). We note that our counterexample uses a monotone polygon. We also show that question (2) has a positive answer for monotone polygons.
By
Sakai, Toshinori; Urrutia, Jorge
1 Citation
Let P be a set of n points such that each of its elements has a unique weight in {1, …, n}; P is called a wp-set. A non-crossing polygonal line connecting some elements of P in increasing (or decreasing) order of their weights is called a monotonic path of P. A simple polygon with vertices in P is called monotonic if it is formed by a monotonic path and an edge connecting its endpoints. In this paper we study the problem of finding large monotonic polygons and paths in wp-sets. We establish some sharp bounds concerning these problems. We also study extremal problems on the number of monotonic paths and polygons of a wp-set.
By
Martínez, Genaro J.; Morita, Kenichi; Adamatzky, Andrew; Margenstern, Maurice
1 Citation
We study the Life-like cellular automaton rule B2/S2345. This automaton exhibits chaotic behavior yet is capable of purposeful computation. The automaton implements Boolean gates via patterns that compete for space as they propagate in channels. Values of Boolean variables are encoded into two types of patterns: symmetric (False) and asymmetric (True). We construct basic logical gates and elementary arithmetical circuits by simulating logical signals using glider reactions taking place in the channels, which are built of non-destructible still lifes. We design a binary adder of majority gates realised in rule B2/S2345.
By
Cruz-Techica, Sonia; Escalante-Ramírez, Boris
1 Citation
The Hermite transform is introduced as an image representation model for multiresolution image fusion with noise reduction. Image fusion is achieved by combining the steered Hermite coefficients of the source images; the coefficients are merged with a decision rule, based on linear algebra, through a measurement of linear dependence. The proposed algorithm has been tested on both multifocus and multimodal image sets, producing results that exceed those achieved with other methods such as wavelets, curvelets [11], and contourlets [2], proving that our scheme better characterizes the important structures of the images while reducing noise.
By
Sakai, Toshinori; Urrutia, Jorge
Let S = {s(1), …, s(n)} be a permutation of the integers {1, …, n}. A subsequence of S with elements {s(i_{1}), …, s(i_{k})} is called an increasing subsequence if s(i_{1}) < ⋯ < s(i_{k}); it is called a decreasing subsequence if s(i_{1}) > ⋯ > s(i_{k}). The weight of a subsequence of S is the sum of its elements. In this paper, we prove that any permutation of {1, …, n} contains an increasing or a decreasing subsequence of weight greater than $n\sqrt{2n/3}$.
Our motivation to study the previous problem arises from the following problem: Let P be a set of n points on the plane in general position, labeled with the integers {1, …, n} in such a way that the labels of different points are different. A non-crossing path Π with vertices in P is an increasing path if, when we travel along it starting at one of its endpoints, the labels of its vertices always increase. The weight of an increasing path is the sum of the labels of its vertices. We determine lower bounds on the weight of the heaviest increasing path a point set always has.
We also study the problem of finding a non-crossing matching of the elements of P of maximum weight, where the weight of an edge with endpoints i, j ∈ P is min{i, j}.
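For a given permutation, the quantity bounded above is easy to compute; a minimal O(n²) dynamic-programming sketch (ours, for illustration):

```python
def max_weight_monotone(s):
    """Maximum weight (sum of elements) of an increasing or a decreasing
    subsequence of the permutation s."""
    def best_increasing(seq):
        n = len(seq)
        w = [0] * n                  # w[i]: best weight of an increasing
        for i in range(n):           # subsequence ending at position i
            w[i] = seq[i] + max([w[j] for j in range(i) if seq[j] < seq[i]],
                                default=0)
        return max(w)
    # a decreasing subsequence of s is an increasing one of its reverse
    return max(best_increasing(s), best_increasing(s[::-1]))
```

For s = (3, 1, 2) the best is the decreasing subsequence (3, 2) of weight 5.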
By
Vega, Gerardo
Sufficient conditions for the construction of a two-weight cyclic code by means of the direct sum of two one-weight cyclic codes were recently presented in [4]. On the other hand, an explicit formula for the number of one-weight cyclic codes, when the length and dimension are given, was proved in [3]. By imposing some conditions on the finite field, we now combine both results in order to give a lower bound for the number of two-weight cyclic codes with composite parity-check polynomials.
By
Garro, Beatriz Aurora; Vazquez, Roberto Antonio; Rodríguez, Katya
4 Citations
DNA microarrays are a powerful technique in genetic science due to the possibility of analyzing the gene expression level of millions of genes at the same time. Using this technique, it is possible to diagnose diseases, identify tumours, select the best treatment to resist illness, detect mutations, and serve prognosis purposes. However, the main problem that arises when DNA microarrays are analyzed with computational intelligence techniques is that the number of genes is too big and the samples are too few. For this reason, it is necessary to apply preprocessing techniques to reduce the dimensionality of DNA microarrays. In this paper, we propose a methodology to select the best set of genes that allows classifying the disease class of a gene expression profile with good accuracy, using the Artificial Bee Colony (ABC) algorithm and distance classifiers. The results are compared against the Principal Component Analysis (PCA) technique and others from the literature.
By
Cano, Javier; Tóth, Csaba D.; Urrutia, Jorge
For every n ∈ ℕ, there is a straight-line drawing D_{n} of a planar graph on n vertices such that in any crossing-free straight-line drawing of the graph, at most O(n^{0.4982}) vertices lie at the same position as in D_{n}. This improves on an earlier bound of $O(\sqrt{n})$ by Goaoc et al. [6].
By
Vega, Gerardo; Vázquez, Carlos A.
3 Citations
A remarkably general result which provides the evaluation of a family of exponential sums was presented by Marko J. Moisio (Acta Arithmetica, 93 (2000) 117-119). In this work, we use a particular instance of this general result in order to determine the value distribution of a particular exponential sum. Then, motivated by new and original ideas of Changli Ma, Liwei Zeng, Yang Liu, Dengguo Feng and Cunsheng Ding (IEEE Trans. Inf. Theory, 57(1) (2011) 397-402), we use this value distribution in order to obtain the weight distribution of a family of reducible cyclic codes. As we will see later, all the codes in this family are non-projective cyclic codes. Furthermore, they can be identified in a very easy way. In fact, as a byproduct of this easy identification, we will be able to determine the exact number of cyclic codes in the family when the length and dimension are given.
By
Frank-Bolton, Pablo; Alvarado-González, Alicia Montserrat; Aguilar, Wendy; Frauel, Yann
1 Citation
A robot localization scheme is presented in which a mobile robot finds its position within a known environment through image comparison. The images compared are those taken by the robot throughout its reconnaissance trip and those stored in an image database that contains views taken from strategic positions within the environment, which also carry position and orientation information. Image comparison is carried out using a scale-dependent keypoint-matching technique based on SIFT features, followed by a graph-based outlier elimination technique known as Graph Transformation Matching. Two techniques for position and orientation estimation are tested (epipolar geometry and clustering), followed by a probabilistic approach to position tracking (based on Monte Carlo localization).
By
Urrutia, Jorge
In 1973, Victor Klee posed the following question: How many guards are necessary, and how many are sufficient to patrol the paintings and works of art in an art gallery with n walls?
This wonderfully naïve question of combinatorial geometry has, since its formulation, stimulated a plethora of fun papers, surveys and a book, most of them written in the last twenty-five years. Several variations of the original Art Gallery Problem have appeared, and continue to appear, in the literature. In this talk, we will present some recent work motivated by the following problem.
Experience dictates that, while trying to find the best location for a wireless modem within a building, the main factor that attenuates the signal is the number of walls that the signal has to cross. Thus we call a wireless modem (from now on, a modem) a k-modem if the signal it transmits is capable of crossing k walls of a building and still providing a strong enough signal.
A generalization of Klee's question is thus: How many k-modems are necessary, and how many are sufficient, to cover the interior of an art gallery with n walls? For k = 0, our problem reduces to the original Art Gallery Problem.
By
Bagnoli, Franco; Dridi, Sara; Yacoubi, Samira; Rechtman, Raúl
1 Citation
Probabilistic Cellular Automata are extended stochastic systems, widely used for modelling phenomena in many disciplines. The possibility of controlling their behaviour is therefore an important topic. We shall present here an approach to the problem of controlling such systems by acting only on the boundary of a target region.
By
Galicia-Haro, Sofia N.; Gelbukh, Alexander F.
This paper reports research on temporal expressions formed by a common temporal expression for a period of years modified by an adverb of time. From a Spanish corpus we found that some of those phrases are age-related expressions. To automatically determine the temporal phrases with such a meaning, we analyzed a larger sample obtained from the Internet. We analyzed these examples to define the relevant features to support a learning method. We present some preliminary results of applying a decision tree.
By
Río, Manuel Beltrán; Cocho, Germinal
We trace the rank-size distribution of notes in harmonic music, which in previous works we suggested was much better represented by the two-parameter, first-class Beta distribution than by the customary power law, to the ranked mixing of distributions dictated by the harmonic and instrumental nature of the piece. The same representation is shown to arise in other fields by the same type of ranked shuffling of distributions. We include the codon content of intergenic DNA sequences and the ranked distribution of sizes of trees in a given area as examples. We show that the proposed fits increase in accuracy with the number of distributions that are mixed and ranked.
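Assuming the two-parameter Beta-like rank distribution meant here has the form used in this line of work, $f(r) = A\,(N+1-r)^b / r^a$ (our reading; with b = 0 it reduces to the customary power law), it can be evaluated as:

```python
def beta_like_rank(r, N, a, b, A=1.0):
    """Two-parameter Beta-like rank distribution (assumed form):
    f(r) = A * (N + 1 - r)**b / r**a, for ranks r = 1..N.
    The parametrization is our assumption, not quoted from the paper."""
    return A * (N + 1 - r) ** b / r ** a
```

A fit would estimate a and b from the ranked data; the second exponent b captures the downward bend at the tail that a pure power law misses.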
By
Murguía-Romero, Miguel; Serrano-Estrada, Bernardo; Gallardo-Ortíz, Itzell A.; Jiménez-Flores, J. Rafael; Villalobos-Molina, Rafael
Obesity and metabolic syndrome pave the way to type 2 diabetes and cardiovascular disease. Obesity and metabolic syndrome prevalence are high (39% and 13%, respectively) in the young Mexican population, where feeding habits are one of the main factors leading to these maladies. A plausible strategy to address the problem is thus to encourage the young to review and control their feeding habits. We present an internet system for young Mexicans to register, self-monitor, and assess their feeding habits. The system is organized in eight questionnaires, one of them about food frequencies. Anthropometric data such as weight, height, and waist and hip circumference are also registered to assess and follow up weight condition, through the body mass index and the waist/hip ratio. The user can generate a PDF report that automatically summarizes all the data captured in the questionnaires in two pages. Follow-up charts of some parameters are also available for monthly data collection once the user fills in the system. More than 2,000 young students have used this system since 2013, and it is open universally (www.misalud.abacoac.org) to all young people wanting to self-monitor their feeding habits and obtain general health recommendations.
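The two anthropometric indicators the system reports can be computed directly; a minimal sketch (the formulas are standard, the function names are ours):

```python
def body_mass_index(weight_kg, height_m):
    """BMI = weight / height^2, in kg/m^2."""
    return weight_kg / height_m ** 2

def waist_hip_ratio(waist_cm, hip_cm):
    """Waist-to-hip ratio from circumferences in the same unit."""
    return waist_cm / hip_cm
```

For example, 80 kg at 2.00 m gives a BMI of 20.0, and an 80 cm waist with 100 cm hips gives a ratio of 0.8.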
By
Rosenblueth, David A.; Muñoz, Stalin; Carrillo, Miguel; Azpeitia, Eugenio
6 Citations
Boolean networks are important models of gene regulatory networks. Such models are sometimes built from (1) a gene interaction graph and (2) a set of biological constraints. A gene interaction graph is a directed graph representing positive and negative gene regulations. Depending on the biological problem being solved, the set of biological constraints can vary and may include, for example, a desired set of stationary states. We present a symbolic, SAT-based method for inferring synchronous Boolean networks from interaction graphs augmented with constraints. Our method first constructs Boolean formulas in such a way that each truth assignment satisfying these formulas corresponds to a Boolean network modeling the given information. Next, we employ a SAT solver to obtain the desired Boolean networks. Through a prototype, we show results illustrating the use of our method in the analysis of Boolean gene regulatory networks of the Arabidopsis thaliana root stem cell niche.
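On tiny instances the inference can be mimicked without a SAT solver by enumerating candidate update functions directly; the sketch below (ours, not the paper's symbolic encoding) keeps every synchronous Boolean network, over the given regulators, that has all the required fixed points:

```python
from itertools import product

def infer_networks(regulators, fixed_points):
    """regulators: dict gene -> tuple of its regulator genes;
    fixed_points: list of required stationary states (dict gene -> 0/1).
    Returns every network (dict gene -> truth table over its regulators)
    whose synchronous update leaves each fixed point unchanged.
    Brute force: tiny instances only."""
    genes = sorted(regulators)
    # candidate truth tables for each gene: 2^(2^k) choices for k regulators
    tables = [list(product([0, 1], repeat=2 ** len(regulators[g])))
              for g in genes]
    networks = []
    for choice in product(*tables):
        net = dict(zip(genes, choice))
        def step(state):
            new = {}
            for g in genes:
                idx = 0
                for reg in regulators[g]:   # regulator values index the table
                    idx = (idx << 1) | state[reg]
                new[g] = net[g][idx]
            return new
        if all(step(fp) == fp for fp in fixed_points):
            networks.append(net)
    return networks
```

A single self-regulating gene required to have the fixed point 1 admits exactly the two update functions whose table maps input 1 to 1.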
By
González-Castro, Victor; MacKinnon, Lachlan M.; Pilar Angeles, María
1 Citation
In the last few years the amount of data stored on computer systems has been growing at an accelerated rate. These data are frequently managed within data warehouses. However, the current data warehouse architectures based on n-ary-Relational DBMSs are reaching their limits in efficiently managing such large amounts of data. Some DBMSs are able to load huge amounts of data; nevertheless, the response times become unacceptable for business users during information retrieval. In this paper we describe an alternative data warehouse reference architectural configuration (ADW) which addresses many issues that organisations are facing. The ADW approach considers a Binary-Relational DBMS as the underlying data repository. As a result, a number of improvements have been achieved, such as increased data density, reduced data sparsity, dramatically decreased query response times, and a significant workload reduction in data loading, backup and restore tasks.
By
Boland, Ralph P.; Urrutia, Jorge
A polygon Q is tree monotone if, for some highest or lowest point p on Q and for any point q interior to Q, there is a y-monotone curve from p to q whose interior is interior to Q. We show how to partition an n-vertex polygon P in Θ(n) time into tree monotone subpolygons such that any y-monotone curve interior to P intersects at most two of the subpolygons. We then use this partition to further partition P into y-monotone subpolygons such that the number of subpolygons needed to cover any given y-monotone curve interior to P is O(log n). Our algorithm runs in Θ(n) time and space, which is an improvement by an O(log n) factor in time and space over the best previous result.
By
Nava, Rodrigo; Marcos, J. Victor; Escalante-Ramírez, Boris; Cristóbal, Gabriel; Perrinet, Laurent U.; Estépar, Raúl San José
4 Citations
In recent years, with the advent of High-Resolution Computed Tomography (HRCT), there has been increased interest in diagnosing Chronic Obstructive Pulmonary Disease (COPD), which commonly presents as emphysema. Since low-attenuation areas in HRCT images describe different emphysema patterns, the discrimination problem should focus on the characterization of both local intensities and global spatial variations. We propose a novel texture-based classification framework using complex Gabor filters and local binary patterns. We also analyzed a set of global and local texture descriptors to characterize emphysema morphology. The results show the effectiveness of our proposal and that the combination of descriptors provides robust features that lead to an improvement in the classification rate.
By
Kuri-Morales, Angel; Aldana-Bobadilla, Edwin
9 Citations
Genetic Algorithms (GAs) have long been recognized as powerful tools for the optimization of complex problems where traditional techniques do not apply. However, although the convergence of elitist GAs to a global optimum has been mathematically proven, the number of iterations remains a case-by-case parameter. We address the problem of determining the best GA out of a family of structurally different evolutionary algorithms by solving a large set of unconstrained functions. We selected 4 structurally different genetic algorithms and a non-evolutionary algorithm (NEA). A schemata analysis was conducted, further supporting our claims. As the problems become more demanding, the GAs significantly and consistently outperform the NEA. A particular breed of GA (the Eclectic GA) is superior to all others in all cases.
By
Neme, Antonio; Miramontes, Pedro
1 Citation
In the traditional self-organizing map (SOM), the best matching unit (BMU) affects other neurons, through the learning rule, as a function of distance. Here, we propose a new parameter in the learning rule so that neurons are affected by the BMU not only as a function of distance, but also as a function of the frequency of activation, from both the BMU and the input vectors, of the affected neurons. This frequency parameter allows non-radial neighborhoods, and the quality of the formed maps is improved with respect to those formed by the traditional SOM, as we show by comparing several error measures over five data sets.
By
Fraigniaud, Pierre; Rajsbaum, Sergio; Travers, Corentin
4 Citations
This paper studies notions of locality that are inherent to the specification of a distributed task and independent of the computing environment, in a shared-memory wait-free system.
A locality property called projection-closed is identified, that completely characterizes tasks that are wait-free checkable. A task $T = (\mathcal{I}, \mathcal{O}, \Delta)$ is checkable if there exists a wait-free distributed algorithm that, given $s \in \mathcal{I}$ and $t \in \mathcal{O}$, determines whether $t \in \Delta(s)$, i.e., if t is a valid output for s according to the specification of T. Moreover, determining whether a projection-closed task is wait-free solvable remains undecidable, and hence this is a rich class of tasks.
A stronger notion of locality considers tasks where the outputs look identical to the inputs at every vertex (input value of a process). A task $T = (\mathcal{I}, \mathcal{O}, \Delta)$ is said to be locality-preserving if $\mathcal{O}$ is a covering complex of $\mathcal{I}$. This topological property yields obstacles for wait-free solvability different in nature from the classical agreement impossibility results. On the other hand, locality-preserving tasks are projection-closed and therefore always wait-free checkable. A classification of locality-preserving tasks in terms of their relative computational power is provided. A correspondence between locality-preserving tasks and subgroups of the edge-path group of an input complex shows the existence of hierarchies of locality-preserving tasks, each one containing at the top the universal task (induced by the universal covering complex), and at the bottom the trivial identity task.
By
Du, Jingzhe; Kranakis, Evangelos; Ponce, Oscar Morales; Rajsbaum, Sergio
4 Citations
Consider a network of n directional antennae in the plane. We consider the problem of efficient neighbor discovery in a (synchronous) network of sensors employing directional antennae. In this setting sensors send messages and listen for messages by directing their antennae towards a specific direction (which is not necessarily known in advance). In our model the directional antennae can be rotated by the sensors as required so as to discover all neighbors in their vicinity. In this paper we will limit ourselves to the (D,D) communication model whereby sensors employ directional antennae with identical transmission/reception beam widths. Our methodology is based on techniques for symmetry breaking so as to enable sender/receiver communication. We provide 1) deterministic algorithms that introduce delay in the rotation of the antennae and exploit knowledge of the existence of a vertex coloring of the network, and 2) randomized algorithms that require knowledge only of an upper bound on the size of the network so as to accomplish neighbor discovery. In both instances we study tradeoffs on the efficiency of the algorithms proposed.
By
Carrillo, Miguel; Rosenblueth, David A.
1 Citation
We present a recursive algorithm to update a Kripke model so as to satisfy a formula of the Computation-Tree Logic (CTL). Recursive algorithms for model update face a difficulty: deleting (adding) transitions from (to) a Kripke model to satisfy a universal (an existential) subformula may dissatisfy some existential (universal) subformulas. Our method employs protected models to overcome this difficulty. We demonstrate our algorithm with a classical example of automatic synthesis described by Emerson and Clarke in 1982. From a dummy model, where the accessibility relation is the identity relation, our algorithm can efficiently generate a model satisfying a specification of mutual exclusion in a variant of CTL. This variant extends CTL with an operator that limits the out-degree of states. We compare our method with a generate-and-test algorithm and outline a proof of soundness and completeness for our method.
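As background for the semantics such an update algorithm must preserve, here is a minimal sketch of evaluating the CTL operator EF by backward reachability (the four-state Kripke model and the function name are hypothetical, not taken from the paper):

```python
def sat_EF(states, trans, sat_p):
    """States satisfying EF p: p is reachable along some path.
    Computed as a least fixed point by backward reachability."""
    result = set(sat_p)
    changed = True
    while changed:
        changed = False
        for (s, t) in trans:
            # A state with a successor in the result set joins it.
            if t in result and s not in result:
                result.add(s)
                changed = True
    return result

# Hypothetical model: 0 -> 1 -> 2 (self-loop), 3 -> 3; p holds in {2}.
states = {0, 1, 2, 3}
trans = {(0, 1), (1, 2), (2, 2), (3, 3)}
print(sorted(sat_EF(states, trans, {2})))  # [0, 1, 2]
```

Model *update*, as opposed to model checking, works in the other direction: when a state fails such a property, transitions are added or removed so that it holds, which is where the protected-model machinery comes in.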
By
Salomão, Roberta C. S.; Rebelo, Francisco; Rodríguez, Fernando Gamboa
This work is part of a user-centered design research project with 106 international university students of 27 nationalities living in Brazil, Mexico and Portugal, which resulted in the design and programming of the 3D adventure game for learning foreign languages, "Back to the Knowledge – the foreign student and the book of language and culture". With the goal of exploring a new approach to learning a foreign language autonomously through an educational game, developed according to the needs and expectations of its target audience, this two-year project started in 2014 by conducting interviews to create university students' Personas. This article presents the final user evaluations from the perspective of PlayPersonas. The quantitative and qualitative analysis of ten students' ways of playing compiled player preferences and behaviors, in terms of interaction and navigation, into five PlayPersonas and allowed them to be contrasted with the initial Persona.
By
Fraigniaud, Pierre; Rajsbaum, Sergio; Travers, Corentin
15 Citations
This paper studies notions of locality that are inherent to the specification of distributed tasks by identifying fundamental relationships between the various scales of computation, from the individual process to the whole system. A locality property called projection-closed is identified. This property completely characterizes tasks that are wait-free checkable, where a task $T = (\mathcal{I}, \mathcal{O}, \Delta)$ is said to be checkable if there exists a distributed algorithm that, given $s \in \mathcal{I}$ and $t \in \mathcal{O}$, determines whether $t \in \Delta(s)$, i.e., whether $t$ is a valid output for $s$ according to the specification of $T$. Projection-closed tasks are proved to form a rich class of tasks. In particular, determining whether a projection-closed task is wait-free solvable is shown to be undecidable. A stronger notion of locality is identified by considering tasks whose outputs “look identical” to the inputs at every process: a task $T = (\mathcal{I}, \mathcal{O}, \Delta)$ is said to be locality-preserving if $\mathcal{O}$ is a covering complex of $\mathcal{I}$. We show that this topological property yields obstacles for wait-free solvability different in nature from the classical impossibility results. On the other hand, locality-preserving tasks are projection-closed, and thus they are wait-free checkable. A classification of locality-preserving tasks in terms of their relative computational power is provided. This is achieved by defining a correspondence between subgroups of the edge-path group of an input complex and locality-preserving tasks. This correspondence enables us to demonstrate the existence of hierarchies of locality-preserving tasks, each one containing, at the top, the universal task (induced by the universal covering complex), and, at the bottom, the trivial identity task.
By
Martínez, Genaro J.; Adamatzky, Andrew; McIntosh, Harold V.
4 Citations
We study a two-dimensional cellular automaton (CA), called Diffusion Rule, which exhibits diffusion-like dynamics of propagating patterns. In computational experiments we discover a wide range of mobile and stationary localizations (gliders, oscillators, glider guns, puffer trains), analyze spatio-temporal dynamics of collisions between gliders, and discuss possible applications in unconventional computing.
By
Ita, Guillermo; Bello, Pedro; Contreras, Meliza; Catana-Salazar, Juan C.
We present a method to compute the number of independent sets for two basic graph patterns: simple cycles and trees. We show how to extend this initial method to process more complex graph topologies efficiently.
We consider polygonal array graphs, which are a graphical representation of molecular compounds. We show how our method processes polygonal trees based on a 2-treewidth decomposition of the input graph.
A pair of macros is associated with each basic graph pattern in this decomposition, allowing us to represent and perform a series of repetitive operations while the same pattern graph is found. This results in a linear-time algorithm for counting the number of independent sets.
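For the tree case, counting independent sets in linear time can be illustrated with the standard bottom-up dynamic program (a textbook sketch, not the macro-based algorithm of the paper):

```python
def count_independent_sets(n, edges, root=0):
    """Count independent sets of a tree on vertices 0..n-1.
    inc[v]/exc[v]: number of independent sets of v's subtree with v
    included / excluded."""
    children = {v: [] for v in range(n)}
    parent = {root: None}
    order = [root]
    for v in order:  # BFS to orient the tree away from the root
        for a, b in edges:
            for u, w in ((a, b), (b, a)):
                if u == v and w not in parent:
                    parent[w] = v
                    children[v].append(w)
                    order.append(w)
    inc, exc = [1] * n, [1] * n
    for v in reversed(order):  # combine children bottom-up
        for c in children[v]:
            inc[v] *= exc[c]            # v included: children excluded
            exc[v] *= inc[c] + exc[c]   # v excluded: children free
    return inc[root] + exc[root]

# Path on 3 vertices: {}, {0}, {1}, {2}, {0,2} -> 5 independent sets
print(count_independent_sets(3, [(0, 1), (1, 2)]))  # 5
```

Each vertex is combined with its children in constant time per edge, which is the same linear-time behavior the macro-based method achieves on repeated polygonal patterns.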
By
Martínez-Díaz, Saúl; Martínez-Chavelas, Saúl
Geometrical distortions are a major problem in image recognition. Composite correlation filters can be used for distortion-invariant image recognition by incorporating rotated versions of the target object. Traditionally, composite filters are designed with linear techniques, but these filters are sensitive to non-Gaussian noise. For the same purpose, composite nonlinear filters have been proposed as well. These filters have good discrimination capability and are robust to non-Gaussian noise and illumination changes; however, the performance of the filter can degrade as the number of incorporated training images increases. In this paper, we propose a method for designing rotation-invariant composite nonlinear filters. The method aims to maximize the number of objects incorporated into the filter while preserving its performance.
By
Miranda, Natalia; Chávez, Edgar; Piccoli, María Fabiana; Reyes, Nora
5 Citations
Proximity queries consist of retrieving objects near a given query. To avoid a brute-force scan over a large database, an index can be used. However, for some problems, indexes are mostly useless (their running times are worse than a sequential scan).
On the other hand, researchers have tried massively parallel hardware (such as GPGPUs) in the quest for faster query times. The results have been modest because current algorithms are cumbersome, while GPGPU architectures favor simple kernels, have a clear memory hierarchy, and need close to zero crosstalk between processing units. We have engineered very fast algorithms for proximity queries taking these principles into account, all of which are presented in this paper.
In our approach no index is built, the crosstalk between threads is eliminated, and the higher (faster) levels of the memory hierarchy are used consistently. The absence of data structures allows all the available memory to be used for the database, and furthermore makes it possible to do stream processing on very large data collections.
By
Zúñiga, Angel; Sierra, Gerardo; Bel-Enguix, Gemma; Galicia-Haro, Sofía N.
Being able to create a natural language compiler has been one of the most sought-after goals since the very beginning of artificial intelligence. Since then, however, it has remained an elusive and difficult task, to the extent of being considered almost impossible to achieve. In this article, we present a promising path by using a grammar formalism which attempts to model natural language, in principle, by using minimalist grammars as one of the most recently proposed instances of formalisms of this type. The main idea consists in creating a parser based on this type of grammar which can recognize and analyze the text (or input program) written in natural language, and using this parser as the front end of a compiler. For the rest of the compilation process, the usual phases of a classic programming language compiler are employed. Moreover, we present a prototype of a natural language compiler whose specific language is that of arithmetic expressions, in order to show that the proposed compiler design can indeed be put into practice, that is, that it is actually possible to create a natural language compiler following this promising path.
By
Edmonds, Bruce; Gershenson, Carlos
The Turing Test checks for human intelligence, rather than any putative general intelligence. It involves repeated interaction requiring learning in the form of adaptation to the human conversation partner. It is a macro-level post-hoc test, in contrast to the definition of a Turing machine, which is an a priori micro-level definition. This raises the question of whether learning is just another computational process, i.e., whether it can be implemented as a Turing machine. Here we argue that learning or adaptation is fundamentally different from computation, though it does involve processes that can be seen as computations. To illustrate this difference we compare (a) designing a Turing machine and (b) learning a Turing machine, defining both for the purpose of the argument. We show that there is a well-defined sequence of problems which are not effectively designable but are learnable, in the form of the bounded halting problem. Some characteristics of human intelligence are reviewed, including its interactive nature, learning abilities, imitative tendencies, linguistic ability and context-dependency. A story that explains some of these is the Social Intelligence Hypothesis. If this is broadly correct, it points to the necessity of a considerable period of acculturation (social learning in context) if an artificial intelligence is to pass the Turing Test. Whilst it is always possible to 'compile' the results of learning into a Turing machine, this would not be a designed Turing machine and would not be able to continually adapt (pass future Turing Tests). We conclude three things, namely that: a purely "designed" Turing machine will never pass the Turing Test; that there is no such thing as a general intelligence, since it necessarily involves learning; and that learning/adaptation and computation should be clearly distinguished.
By
Kranakis, Evangelos; MacQuarie, Fraser; Morales-Ponce, Oscar; Urrutia, Jorge
Assume that n directional antennae located at distinct points in the plane are rotating at constant identical speeds. They all have identical range and sensor angle (or field of view). We propose and study the Rotating Antennae Coverage Problem, a new problem concerning rotating sensors for the uninterrupted coverage of a region in the plane. More specifically, what is the initial orientation of the sensors, minimum angle, and range required so that a given (infinite or finite) line or planar domain is covered by the rotating sensors at all times? We give algorithms for determining the initial orientation of the sensors and analyze the resulting angle/range tradeoffs for ensuring continuous coverage of a given region or line in the plane with identical rotating sensors of given transmission angle and range. We also investigate other variants of the problem whereby for a given parameter T (representing time) there is no point in the domain that is left unattended by some sensor for a period of time longer than T. Despite the apparent simplicity of the problem several of the algorithms proposed are intricate and elegant. We have also implemented our algorithms in C++ and the code can be downloaded on the web.
By
Bagnoli, Franco; El Yacoubi, Samira; Rechtman, Raul
An interesting problem in extended physical systems is that of the regional control, i.e., how to add a suitable control at the boundary or inside a region of interest so that the state of such region is near to a desired one. Many physical problems are modelled by means of cellular automata. It is therefore important to port control concepts to this discrete world. In this paper we address the problem of regional controllability of cellular automata via boundary actions, i.e., we investigate the characteristics of a cellular automaton rule so that it can be controlled inside a given region only acting on the value of sites at its boundaries.
By
Fuentes-Pineda, Gibran; Meza-Ruíz, Ivan Vladimir
We present Sampled Weighted Min-Hashing (SWMH), a randomized approach to automatically mine topics from large-scale corpora. SWMH generates multiple random partitions of the corpus vocabulary based on term co-occurrence and agglomerates highly overlapping inter-partition cells to produce the mined topics. While other approaches define a topic as a probabilistic distribution over a vocabulary, SWMH topics are ordered subsets of such vocabulary. Interestingly, the topics mined by SWMH underlie themes from the corpus at different levels of granularity. We extensively evaluate the meaningfulness of the mined topics both qualitatively and quantitatively on the NIPS (1.7 K documents), 20 Newsgroups (20 K), Reuters (800 K) and Wikipedia (4 M) corpora. Additionally, we compare the quality of SWMH with Online LDA topics for document representation in classification.
By
Vega-Alvarado, Leticia; Ortíz-Posadas, Martha R.
The paper presents a model to analyze data structured in classes, in order to determine their representativity and classification. The model includes an algorithm integrating three parameters: Informational Weight, Differential Weight and Tipicity Contrast. As an application, we analyze clinical data on 160 patients with lip and palate malformations. The model allows assessing how representative the sample is, using the variables of the cleft, lip and nose along with some expertly determined comparison criteria. Moreover, using the Tipicity Contrast parameter, a supervised classification was achieved that correctly classifies, on average, 93% of the patients. As a result, this model can provide helpful auxiliary criteria in medical decision-making.
By
Aguilar, Wendy; Pineda, Luis A.
1 Citation
In this paper we present the integration of graph-based visual perception with spoken conversation in human-robot interaction. The proposed architecture has a dialogue manager as the central component of the multimodal interaction, which directs the robot's behavior in terms of the intentions and actions associated with the conversational situations. We tested these ideas on a mobile robot programmed to act as a visitor's guide to our department of computer science.
By
Altamirano, Carlo; Robledo, Alberto
2 Citations
We demonstrate that the laws of Zipf and Benford, that govern scores of data generated by many and diverse kinds of human activity (as well as other data from natural phenomena), are the centerpiece expressions of a generalized thermodynamic structure. This structure is obtained from a deformed type of statistical mechanics that arises when configurational phase space is incompletely visited in an especially severe fashion. Specifically, the restriction is that the accessible fraction of this space has fractal properties. We obtain a generalized version of Benford's law for data expressed in full and not by the first digit. The inverse functional of this expression is identified with Zipf's law, but it naturally includes the tails observed in real data for small rank. Thermodynamically, our version of Benford's law expresses a Legendre transform between two entropy (or Massieu) potentials, while Zipf's law is merely the expression that relates the corresponding partition functions.
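For reference, the classical first-digit form of Benford's law that the paper generalizes is P(d) = log10(1 + 1/d) for d = 1..9; a quick check of its values and normalization:

```python
import math

# Classical Benford first-digit law: P(d) = log10(1 + 1/d), d = 1..9.
benford = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

print(round(benford[1], 3))  # 0.301: '1' leads about 30% of the time
# The probabilities telescope to log10(10) = 1, i.e. they normalize:
print(round(sum(benford.values()), 10))  # 1.0
```

The paper's generalized law applies to data "expressed in full", not just the leading digit, with this classical form as a limiting case.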
By
Galicia-Haro, Sofia N.; Gelbukh, Alexander F.
In this article we present a simple method to extract Spanish nouns with the linguistic property of "human" animacy. We describe an unsupervised method based on lexical patterns and on a person-name list enlarged from a collection of newspaper texts. Results were obtained from the Web; filters and estimation methods are proposed to validate them.
By
Martínez, Genaro J.; Adamatzky, Andrew; Morita, Kenichi; Margenstern, Maurice
5 Citations
We study a Life-like cellular automaton rule B2/S2345, where a cell in state '0' takes state '1' if it has exactly two neighbors in state '1', and the cell remains in state '1' if it has between two and five neighbors in state '1.' This automaton is a discrete analog of spatially extended chemical media, combining properties of both sub-excitable and precipitating chemical media. When started from a random initial configuration, the B2/S2345 automaton exhibits chaotic behavior. Configurations with a low density of state '1' show the emergence of localized propagating patterns and stationary localizations. We construct basic logical gates and elementary arithmetical circuits by simulating logical signals with mobile localizations whose propagation is geometrically restricted by stationary, non-destructible localizations. Values of Boolean variables are encoded into two types of patterns — symmetric (False) and asymmetric (True) patterns — which compete for the 'empty' space when propagating in the channels. Implementations of logical gates and binary adders are illustrated explicitly.
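The rule itself is easy to state operationally. A minimal sketch of one synchronous step of B2/S2345 on a sparse set of live cells (an illustrative implementation, not the authors' code):

```python
from collections import Counter

def step(alive, rule_b=frozenset({2}), rule_s=frozenset({2, 3, 4, 5})):
    """One synchronous step of the Life-like rule B2/S2345 on the set
    of live cells (Moore neighborhood, unbounded grid)."""
    # Count, for every cell, how many of its 8 neighbors are alive.
    neigh = Counter(
        (x + dx, y + dy)
        for (x, y) in alive
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    born = {c for c, n in neigh.items() if n in rule_b and c not in alive}
    survive = {c for c in alive if neigh.get(c, 0) in rule_s}
    return born | survive

# Two adjacent live cells: each sees only one live neighbor, so both
# die (1 is not in S2345), while the four cells that see exactly two
# live neighbors are born.
print(sorted(step({(0, 0), (0, 1)})))  # [(-1, 0), (-1, 1), (1, 0), (1, 1)]
```

Iterating `step` from a random initial set reproduces the chaotic regime described in the abstract; sparse seeds give rise to the localized patterns.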
By
Chevalier, Margarita; Chanes, Lorena; Guibelalde, Eduardo; Brandan, María-Ester; Alieva, Tatiana
X-ray field phase differences caused by an object can induce edge enhancement in a radiological image. Phase contrast effects have been interpreted using a model based on X-ray Fresnel diffraction. The dependence of edge enhancement on X-ray energy, object characteristics and magnification has been systematically studied and interpreted by calculations using this model. Good agreement was found between numerical simulations and experimental results obtained under magnification conditions with a commercial digital mammography unit. The numerical calculations have been extended to the more general situation of a variable total source-to-detector distance. In this paper, we present the results of these calculations as well as the results derived from the analysis of the influence of the fiber radius on the edge enhancement.
By
Góngora, Pedro Arturo; Rosenblueth, David A.
1 Citation
The game-theoretic approach to multi-agent systems has been incorporated into the model-checking agenda by using temporal and dynamic logic to characterize notions such as Nash equilibria. Recent efforts concentrate on pure-strategy games, where intelligent agents act deterministically guided by utility functions. We build upon this tradition by incorporating stochastic actions. First, we present an extension of the Probabilistic Computation-Tree Logic (PCTL) to quantify and compare expected costs. Next, we give a discrete-time Markov chain codification for mixed-strategy games. Finally, we characterize mixed-strategy Nash equilibria.
By
Lira, Jorge; Marín, Erick
1 Citation
Climate change has produced transformations in the coastal zone of Tamaulipas State. Such changes include modifications to the coastline and transformations of the texture-relief and texture of the zone. In this work, high-resolution panchromatic SPOT images have been employed to quantify such modifications. A synthetic multispectral image is used to validate our results. To quantify the texture-relief and texture, the multispectral image is modeled as a vector field of as many dimensions as bands of the image. Upon this field, the vector operators divergence and Laplacian are applied. Results are presented for an area of Tampico-Altamira; details of the methodology are shown, and results are discussed.
By
Lugo, Igor; Gershenson, Carlos
Historical evidence in some regions of Latin America has suggested that the system of ancient routes between places could have determined the success or collapse of prehistoric societies. The identification of such routes provides essential information for understanding initial conditions in the evolution of the current road infrastructure. Given the increasing technology applied to generate and process geospatial information, we propose a retrospective spatial analysis for discovering a large-scale network of ancient routes before the conquest of the Aztecs by the Spanish around 1520 CE. The method consists in analyzing existing road networks (highways) that connect a system of cities (continuously built-up areas) to deduce routes by using geoprocessing methods, network analysis, and historical evidence. The results of this research support the idea that the retrospective method may be applied to other cases to decipher and understand initial conditions in the evolution of road infrastructures by combining different types of data and scientific fields.
By
Nava, Rodrigo; González, Germán; Kybic, Jan; Escalante-Ramírez, Boris
1 Citation
Colorectal cancer is a major cause of mortality. As the disease progresses, adenomas and their surrounding tissue are modified. Therefore, a large number of samples from the epithelial cell layer and stroma must be collected and analyzed manually to estimate the potential evolution and stage of the disease. In this study, we propose a novel method for the automatic classification of tumor epithelium and stroma in digitized tissue microarrays. To this end, we use discrete Tchebichef moments (DTMs) to characterize tumors based on their textural information. DTMs are able to capture image features in a non-redundant way, providing a unique description. A support vector machine was trained to classify a dataset composed of 1376 tissue microarrays from 643 patients with colorectal cancer. The proposal achieved a sensitivity of 97.62 % and a specificity of 95 %, showing the effectiveness of the methodology.
By
Cedillo-Hernandez, Antonio; Cedillo-Hernandez, Manuel; Garcia-Ugalde, Francisco; Nakano-Miyatake, Mariko; Perez-Meana, Hector
In this paper we propose a watermarking scheme to ensure copyright protection in video distribution systems. This research is focused on providing a low computational cost solution that allows embedding a robust and imperceptible watermark signal in the video content, such that the original video transmission rate is not affected. Our proposal accomplishes this challenge by introducing a fast and effective method based on spatio-temporal Human Visual System properties that improves imperceptibility along the video sequence while, at the same time, enhancing robustness. Extensive experiments were performed to prove that the proposed algorithm satisfies quickness, invisibility and robustness against several real-world video processing issues. The proposed scheme has the advantages of robustness, simplicity and flexibility, allowing it to be considered a good solution in practical situations, since it can be adapted to different video distribution systems.
By
Gómez-Téllez, J. David; Lluis-Puebla, Emilio; Montiel, Mariana
A 20-note scale is revisited (from Balzano and Zweifel) and endowed with a version of Mazzola's theory of modulation based on the symmetry group of the scale. Mazzola's theory has also been applied by Muzzulini in the context of the usual 12-note equally tempered chromatic scale. A modulation for a 7-note exotic scale, based on this model, is presented in Sect. to exemplify the algorithm by which the modulation quanta are computed, that is, the sets of notes that permit the calculation of the pivot progressions that lead from one scale to another. Then, the modulation model based on symmetries is applied to an 11-note diatonic scale, immersed within the 20-note scale, which shows the viability of the symmetry model for this microtonal case. This work is based on the premise that musical expression has an underlying mathematical structure.
By
Padilla, Pablo; Knights, Francis; Ruiz, Adrián Tonatiuh; Tidhar, Dan
The problem of identifying musical styles using mathematical tools is central not only in musicology and the mathematical theory of music, but also in applications to music pattern recognition and automated music generation in a particular idiom. In this paper we propose a methodology related to the transition network approach developed by D. Cope in his Experiments in Musical Intelligence (EMI). This extension allows for the possibility of defining stylistic cells at different scales as motifs and moduli of networks at the corresponding scale. It can be applied to study recursivity aspects of music. We also outline how this methodology can be used to systematically study stylistic changes in different contexts by incorporating probabilistic and statistical tools and connections with other approaches.
By
Herlihy, Maurice; Rajsbaum, Sergio
7 Citations
Roughly speaking, a simplicial complex is shellable if it can be constructed by gluing a sequence of $n$-simplexes to one another along $(n-1)$-faces only. Shellable complexes have been widely studied because they have nice combinatorial properties. It turns out that several standard models of concurrent computation can be constructed from shellable complexes. We consider adversarial schedulers in the synchronous, asynchronous, and semi-synchronous message-passing models, as well as asynchronous shared memory. We show how to exploit their common shellability structure to derive new and remarkably succinct tight (or nearly so) lower bounds on the connectivity of protocol complexes, and hence on solutions to the $k$-set agreement task in these models. Earlier versions of material in this article appeared in the 2010 ACM Symposium on Principles of Distributed Computing (Herlihy and Rajsbaum 2010) and the International Conference on Distributed Computing (Herlihy and Rajsbaum 2010, doi:10.1145/1835698.1835724).
By
Ruiz, Guillermo; Santoyo, Francisco; Chávez, Edgar; Figueroa, Karina; Tellez, Eric Sadit
7 Citations
Pivot tables are popular for exact metric indexing. It is well known that a large pivot table produces faster indexes. The rule of thumb is to use as many pivots as the available memory allows for a given application. To further speed up searches, redundant pivots can be eliminated, or the scope of the pivots (the number of database objects covered by a pivot) can be reduced.
In this paper, we apply a different technique to speed up searches. We assign objects to pivots while, at the same time, enforcing proper coverage of the database objects. This increases the discarding power of the pivots and in turn leads to faster searches. The central idea is to select a set of essential pivots (without redundancy) covering the entire database. We call our technique extreme pivoting (EP).
A nice additional property of EP is that it balances performance and memory usage. For example, using the same amount of memory, EP is faster than the List of Clusters and the Spatial Approximation Tree. Moreover, EP is faster than LAESA even when it uses less memory.
The EP technique was formally modeled, allowing performance prediction without an actual implementation. Performance and memory usage depend on two parameters of EP, which are characterized with a wide range of experiments. We also provide automatic selection of one parameter when the other is fixed. The formal model was empirically tested with real-world and synthetic datasets, finding high consistency between the predicted and the actual performance.
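The pivot-based discarding that EP refines can be sketched as follows. This is a minimal, generic illustration of standard pivot filtering via the triangle inequality, not the authors' EP implementation; the 1-D data and distance are placeholders.

```python
def dist(a, b):
    # Placeholder metric: absolute difference in 1-D.
    return abs(a - b)

def build_pivot_table(db, pivots):
    # Precompute the distance from every database object to every pivot.
    return {x: [dist(x, p) for p in pivots] for x in db}

def range_search(db, table, pivots, q, r):
    """Answer a range query (q, r), using the pivot table to discard
    objects by the triangle inequality before computing any direct
    distance to them."""
    qd = [dist(q, p) for p in pivots]
    results = []
    for x in db:
        # Lower bound on dist(q, x): max_i |d(q, p_i) - d(x, p_i)|.
        lb = max(abs(qd[i] - table[x][i]) for i in range(len(pivots)))
        if lb > r:
            continue  # safely discarded without touching x
        if dist(q, x) <= r:
            results.append(x)
    return results
```

More pivots tighten the lower bound and discard more objects, which is why a larger table yields faster searches at the price of memory.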
By
MurguíaRomero, Miguel; MéndezCruz, René; VillalobosMolina, Rafael; RodríguezSoriano, Norma Yolanda; GonzálezDalhaus, Estrella; JiménezFlores, Rafael
1 Citations
A knowledge-based system that identifies 10 main metabolic alterations in university students, based on clinical and anthropometric parameters, is presented. Knowledge engineering was carried out through an unstructured expert-interview methodology, resulting in a knowledge base of 17 IF-THEN rules. A backward-chaining inference engine was built in the Prolog language; the attribute-value database holding each student's parameters was also stored as Prolog facts. The system was applied to 592 cases: the clinical and anthropometric parameters of the students stored in the database. The medical diagnoses and recommendations for each student obtained from the system were organized into individualized reports that physicians gave to the students in personal interviews over only two days. The effectiveness of these interviews is largely attributed to the fact that the physicians are the same experts who participated in building the knowledge base.
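Backward chaining over IF-THEN rules, of the kind the Prolog engine performs, can be sketched in Python. The rules and fact names below are hypothetical placeholders, not taken from the paper's knowledge base.

```python
def backward_chain(goal, rules, facts):
    """Prove `goal` by backward chaining: a goal holds if it is a known
    fact, or if some rule concludes it and all of that rule's premises
    can themselves be proved recursively."""
    if goal in facts:
        return True
    for premises, conclusion in rules:
        if conclusion == goal and all(
            backward_chain(p, rules, facts) for p in premises
        ):
            return True
    return False

# Hypothetical rules in the spirit of the system's knowledge base.
RULES = [
    (("high_waist", "high_glucose"), "metabolic_alteration"),
    (("bmi_over_30",), "high_waist"),
]
```

Given the facts {"bmi_over_30", "high_glucose"}, the engine derives "high_waist" from the second rule and then "metabolic_alteration" from the first.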
By
Ramírez, Arturo; Acuña, Francisco González; Romero, Alejandro González; Alquézar, René; Hernández, Enric; Aguilar, Amador Roldán; Olmedo, Ian García
1 Citations
The game of Scrabble, in its competitive form (one vs. one), has been tackled mostly by using Monte Carlo simulation. Recently [1], probability theory (Bayes' theorem) was used to gain knowledge about the opponent's tiles; this proved to be a good approach to further improve computer Scrabble. We used probability to evaluate Scrabble leaves (rack residues); with this evaluation, a heuristic function that dictates a move can be constructed. To calculate these probabilities it is necessary to have a lexicon, in our case a Spanish one. To conduct proper investigations in the domain of Scrabble it is important to have the same lexicon as the one used by humans in official tournaments, and we did a large amount of work to build this free lexicon. In this paper a heuristic function that involves leave probabilities is given. We now have an engine, Heuri, that uses this heuristic, and we have been able to perform some experiments to test it. The tests include matches against highly expert players, and the games played so far give us promising results. For instance, a match between the current World Scrabble Champion (in Spanish) and Heuri was recently played, and Heuri defeated the World Champion 6–0! Heuri includes a move generator which, at the cost of using a lot of memory, is faster than using DAWG [2] or GADDAG [3]. A plan to build a stronger Heuri that combines heuristics using probabilities, opponent modeling, and Monte Carlo simulation is also proposed.
By
Pérez y Pérez, Rafael; Guerrero Román, Iván
This paper describes a computer agent for the automatic generation of visual compositions based on the Engagement-Reflection model of creative writing (Pérez y Pérez, Cogn. Syst. Res. 8, 89–109, 2007; Pérez y Pérez and Sharples, J. Exp. Theor. Artif. Intell. 13, 119–139, 2001). During engagement the system progresses the composition; during reflection the agent evaluates, and if necessary modifies, the material produced so far, and generates a set of guidelines that constrains the production of material during engagement. The final output is the result of a constant interplay between these two states. We offer details of the model and describe a prototype that gives users the possibility of adding compositions to the knowledge base. Then, we show how, through engagement and reflection cycles, the system is capable of generating novel outputs. Using a questionnaire, we asked a group of volunteers to describe the features of pieces produced by the program and the features of pieces produced by human designers. The results suggest that our agent provides an adequate novel framework for studying the automatic generation of visual compositions.
By
Nara, Chie; Sakai, Toshinori; Urrutia, Jorge
1 Citations
A geometric graph G is a graph whose vertex set is a set P_{n} of n points in the plane in general position, and whose edges are straight line segments (which may cross) joining pairs of vertices of G. We say that G contains a convex r-gon if its vertex and edge sets contain, respectively, the vertices and edges of a convex polygon with r vertices. In this paper we study the following problem: what is the largest number of edges that a geometric graph with n vertices may have such that it does not contain a convex r-gon? We give sharp bounds for this problem. We also give some bounds for the following problem: given a point set P_{n}, how many edges can a geometric graph with vertex set P_{n} have such that it does not contain a convex r-gon?
A result of independent interest is also proved here, namely: let P_{n} be a set of n points in general position. Then there are always three concurrent lines such that each of the six wedges defined by the lines contains exactly $\lfloor \frac{n}{6} \rfloor$ or $\lceil \frac{n}{6} \rceil$ elements of P_{n}.
By
Akiyama, J.; Kaneko, A.; Kano, M.; Nakamura, G.; RiveraCampo, E.; Tokunaga, S.; Urrutia, J.
4 Citations
In this paper we study the following problem: how to divide a cake among the children attending a birthday party so that all the children get the same amount of cake and the same amount of icing. This leads us to the following notion: a perfect k-partitioning of a convex set S is a partitioning of S into k convex pieces such that each piece has the same area and $\frac{1}{k}$ of the perimeter of S. We show that for any k, any convex set admits a perfect k-partitioning. Perfect partitionings with additional constraints are also studied.
By
LaureanoCruces, Ana Lilia; CruzGonzález, José Manuel; RamírezRodríguez, Javier; SolanoGonzález, Julio
One of the objectives of distributed artificial intelligence is the decentralization of control. Multi-agent architectures distribute control among two or more agents, each in charge of different events. In this paper the design of a parallel multi-agent architecture for genetic algorithms is described, using a bottom-up behavior design and reactive agents. This design aims to improve the solutions found by parallel genetic algorithms. The purpose of incorporating reactive behavior into parallel genetic algorithms is to improve overall performance by updating the subpopulations according to the general behavior of the algorithm, avoiding getting stuck in local minima. Two kinds of experiments were conducted for each of the algorithms, and the results obtained with both experiments are shown.
By
Mateos, MaríaJulieta; Gastelum, Alfonso; Márquez, Jorge; Brandan, MariaEster
1 Citations
A texture analysis aimed at finding correlations between textural descriptors and lesion diagnosis was applied to Contrast-Enhanced Digital Mammography (CEDM) subtracted images acquired under the single-energy temporal subtraction modality using an iodine-based contrast medium. The study, based on textural descriptors from the Gray-Level Co-occurrence Matrix (GLCM), included 68 CEDM images of 17 patients, 10 with cancer and 7 with benign lesions, acquired 1 to 5 min after iodine injection. Seventeen GLCM descriptors were analyzed. Image processing consisted of geometric registration, logarithmic subtraction, and selection of regions of interest (adipose, glandular, and lesion ROIs) by the radiologist. Results for lesion ROIs showed that homogeneity, normalized homogeneity, second-order inverse moment, energy, and inverse variance were insensitive to the presence of iodine, and that a linear correlation existed between the sum mean and the mean pixel value. Logistic regression showed that a linear combination of entropy and diagonal momentum discriminated between malignant and benign lesions with 79% specificity, 93% sensitivity, and 87% accuracy.
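For reference, a GLCM and two of the descriptors mentioned above (energy and homogeneity) can be computed as follows. This is a generic sketch for a single pixel offset, not the study's processing pipeline; the tiny test image is a placeholder.

```python
def glcm(img, dx, dy, levels):
    """Gray-level co-occurrence matrix of a 2-D image (list of rows of
    integer gray levels) for the offset (dx, dy), normalized so that
    its entries sum to 1."""
    h, w = len(img), len(img[0])
    m = [[0.0] * levels for _ in range(levels)]
    n = 0
    for y in range(h):
        for x in range(w):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < h and 0 <= x2 < w:
                m[img[y][x]][img[y2][x2]] += 1
                n += 1
    return [[v / n for v in row] for row in m]

def energy(m):
    # Sum of squared co-occurrence probabilities (angular second moment).
    return sum(v * v for row in m for v in row)

def homogeneity(m):
    # Weighs entries down as the gray levels i and j move apart.
    return sum(
        v / (1 + abs(i - j))
        for i, row in enumerate(m)
        for j, v in enumerate(row)
    )
```

In practice several offsets (distances and angles) are computed and their descriptors averaged or concatenated.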
By
Serrano, Sergio A.; BenítezJimenez, Ricardo; NuñezRosas, Laura; Coro Arizmendi, Ma; Greeney, Harold; ReyesMeza, Veronica; Morales, Eduardo; Escalante, Hugo Jair
The analysis of natural images has been the topic of countless articles in computer vision and pattern recognition (e.g., natural images have been used as benchmarks for object recognition and image retrieval). However, despite the research progress in this field, there is a gap in the analysis of certain types of natural images, for instance, those in the context of animal behavior. In fact, biologists perform the analysis of natural images manually, without the aid of techniques that were supposedly developed for this purpose. In this context, this paper presents a study of automated methods for the analysis of natural images of hummingbirds, with the goal of assisting biologists in the study of animal behavior. The automated analysis of hummingbird behavior is challenging mainly because of (1) the speed at which these birds move and interact; (2) the unpredictability of their trajectories; and (3) their camouflage skills. We report a comparative study of two deep learning approaches for the detection of hummingbirds in their nest. Two variants of transfer learning from convolutional neural networks (CNNs) are evaluated on real imagery for hummingbird behavior analysis. Transfer learning is adopted because not enough images are available for training a CNN from scratch; besides, transfer learning is less time-consuming. Experimental results are encouraging, as acceptable classification performance is achieved with CNN-based features. Interestingly, a pre-trained CNN without fine-tuning, combined with a standard classifier, performed better on the considered data set.
By
Bañuelos, Cristian; Orduña, Felipe
Music information retrieval techniques are used to automatically extract structural data from a piece; however, there have been few attempts to study ways of automatically identifying the musical form of digital files. In this work we present an implementation of the dynamic time warping algorithm for the automatic identification of musical form by means of a segmentation matrix in which we group elements according to maximal similarity. The system was implemented for symbolic files parsed with the music21 library. We tested it on two pieces: Bagatelle No. 25 in A minor by L. v. Beethoven, and Piano Sonata No. 11 in A major, K. 331, movement 3, by W. A. Mozart. The system correctly identified the similar sections in both pieces, each in rondo form. We foresee that this algorithm can be extended to measure harmonic similarity and thereby analyze more complex forms, such as a sonata.
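The core dynamic time warping recurrence can be sketched as follows. This is the textbook formulation of the distance the paper's segmentation matrix is built on, not the authors' system; the local cost function is a placeholder.

```python
def dtw(a, b, d=lambda x, y: abs(x - y)):
    """Dynamic time warping distance between sequences a and b.
    cost[i][j] is the minimal accumulated cost of aligning the first
    i elements of a with the first j elements of b."""
    inf = float("inf")
    n, m = len(a), len(b)
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost[i][j] = d(a[i - 1], b[j - 1]) + min(
                cost[i - 1][j],      # insertion
                cost[i][j - 1],      # deletion
                cost[i - 1][j - 1],  # match
            )
    return cost[n][m]
```

Segments of a piece can then be compared pairwise with `dtw`, and pairs with small distance grouped as repetitions of the same formal section (e.g. the recurring refrain of a rondo).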
By
Lourdes Arredondo, María; Tang, Yu; Rodríguez, Angel Luís; Santillán, Saúl; Chávez, Rafael
In this paper we present an observer, based on an artificial neuro-fuzzy network approach, that estimates the indicated torque of a diesel engine from crankshaft angular position and velocity measurements. These variables can be measured by low-cost sensors. The indicated torque is an important signal for the monitoring and/or control of a diesel engine; however, measuring it directly is impractical, since it requires expensive sensors. A mean-value model of a diesel engine is used in simulation to test the indicated-torque estimator, and the results are presented. This estimator may be useful in the implementation of control or diagnostic strategies where indicated-torque measurements are required.
By
Esquivel Flores, Oscar A.; Benítez Pérez, Héctor
This paper provides a strategy to schedule a real-time distributed system. Modifications to the transmission frequencies (task periods) of the system's individual components impact the system's quality of performance. In this work the authors propose a dynamic linear time-invariant model based upon the transmission frequencies of the agents in a distributed system, and use an LQR control approach to bring the system into a nonlinear region. Numerical simulations show the effectiveness of the LQR feedback control law in modifying the agents' transmission frequencies.
By
Beltrán, Gloria; Romo, Miguel
The structural condition of pavements can be properly evaluated by nondestructive surface deflection testing. Based on the measured deflection responses of pavements to an impact load, it is possible to estimate layer moduli through back-analysis. For that purpose, typical constant values of Poisson's ratio are commonly assumed for each layer material. In this work a thorough investigation is carried out to assess the influence of Poisson's ratio on pavement response modeling. To this end, Artificial Neural Networks are proposed for estimating Poisson's ratio from deflection testing data. A comparative analysis of pavement responses obtained under constant and variable Poisson's ratio conditions is performed.
By
KuriMorales, Angel; TrejoBaños, Daniel; CortesBerrueco, Luis Enrique
1 Citations
The problem of finding clusters in arbitrary sets of data has been attempted using different approaches. In most cases, the use of metrics to determine the adequacy of the said clusters is assumed; that is, the criteria yielding a measure of cluster quality depend on the distance between the elements of each cluster. Typically, one considers a cluster to be adequately characterized if the elements within it are close to one another while, simultaneously, appearing far from those of different clusters. This intuitive approach fails if the variables of the elements of a cluster are not amenable to distance measurements, i.e., if the vectors of such elements cannot be quantified. This case arises frequently in real-world applications, where several variables (if not most of them) correspond to categories. The usual tendency is to assign arbitrary numbers to every category, i.e., to encode the categories. This, however, may result in spurious patterns: relationships between the variables which were not really there at the outset. It is evident that there is no truly valid assignment which may ensure a universally valid numerical value for this kind of variable. But there is a strategy which guarantees that the encoding will, in general, not bias the results. In this paper we explore such a strategy. We discuss the theoretical foundations of our approach and prove that it is the best strategy in terms of the statistical behavior of the sampled data. We also show that, when applied to a complex real-world problem, it allows us to generalize soft computing methods to find the number and characteristics of a set of clusters. We contrast the characteristics of the clusters obtained by the automated method with those given by the experts.
By
Pelczer, Ildikó; Rodríguez, Fernando Gamboa
We describe the design of a computer system for problem field generation in mathematics. The concept of a problem field refers to a sequence of related problems, where the relationships between the problems ensure that solving them in succession promotes deep learning of the topic (sequence problems). We define four relations between a field's problems: difficulty, and rule, question, or property preservation. A problem's difficulty is given by its problem type, its algebraic complexity, and the difficulties of the problem's question and of the rule used for its solution. A problem field based on difficulty modifies one of these elements in order to obtain new problems from existing ones. In rule- and property-preserving generation, the field's problems differ in their type or difficulty level. In question-preserving generation, the problems maintain the initial problem's unknown. Once the details of the change are established, the new problem is generated using templates, rules, and examples.
By
Lozano, Alexis; Mireles, Víctor; Monsivais, Daniel; Stephens, Christopher R.; Alcalá, Sergio Antonio; Cervantes, Francisco
The concept of a “building block” has played an important role in science and, at a more formal level, in the notion of search, especially in the context of Evolutionary Computation. However, there is still a great deal that is not understood about why, or even if, building blocks help in search. In this paper we introduce an elementary search problem and a class of search algorithms that use building blocks. We consider how the use of building blocks affects the efficiency of the search, and moreover how the characteristics of the building blocks (the number of types and their sizes) greatly influence the search.
By
Andrade, Miguel Ángel Gutiérrez; García, Eric Alfredo Rincón
2 Citations
The design of electoral zones is a complex problem in which the democracy of the electoral process is promoted by constraints such as population balance, contiguity, and compactness; indeed, the computational complexity of zone design problems has been shown to be NP-hard. This paper proposes a new measure of compactness, which uses a mesh of square cells to measure the quality of the electoral zones. A practical real case was chosen whose topographical setting causes some traditional measures of compactness to give very poor results, and an algorithm based on simulated annealing was designed to search the space of feasible solutions. The results show that the new measure favors the creation of zones with straight forms and avoids twisted or dispersed figures, without an important effect on the population balance, yielding zones of high quality.
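A generic simulated annealing loop of the kind such a zoning algorithm is built on can be sketched as follows. This is schematic: in the actual problem the state is a zone assignment, the neighbor move reassigns boundary cells, and the cost combines population balance with the square-cell compactness measure. Here a toy one-dimensional cost stands in for all of that.

```python
import math
import random

def simulated_annealing(init, neighbor, cost, t0=1.0, alpha=0.95,
                        steps=2000, seed=0):
    """Generic simulated annealing: always accept improving moves, and
    accept worsening moves with probability exp(-delta/T), cooling the
    temperature T geometrically."""
    rng = random.Random(seed)
    cur, cur_cost = init, cost(init)
    best, best_cost = cur, cur_cost
    t = t0
    for _ in range(steps):
        cand = neighbor(cur, rng)
        delta = cost(cand) - cur_cost
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            cur, cur_cost = cand, cur_cost + delta
        if cur_cost < best_cost:
            best, best_cost = cur, cur_cost
        t *= alpha
    return best, best_cost
```

The acceptance of worsening moves early on is what lets the search escape the local minima that greedy improvement of twisted or dispersed zones would get stuck in.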
By
Bagnoli, Franco; Yacoubi, Samira; Rechtman, Raúl
3 Citations
We study the problem of targeted synchronization of stable chaotic extended systems, i.e., systems which are not chaotic in the usual sense but are unpredictable under finite perturbations. Examples are cellular automata, which are completely discrete dynamical systems. We show that the usual approach may lead to counterintuitive results, but that it is possible to exploit the characteristics of the system in order to reduce the distance between two replicas with less control.
By
Sierra, Gerardo; Lázaro, Jorge; BelEnguix, Gemma
The paper presents LEXIK, an intelligent terminological architecture able to efficiently obtain specialized lexical resources for elaborating dictionaries and providing lexical support for different expert tasks. LEXIK is designed as a powerful tool for creating a rich knowledge base for lexicography. It will process large amounts of data in a modular system that combines several applications and techniques for terminology extraction, definition generation, example extraction, and term banks, which have been partially developed so far. Such integration is a challenge for the area, which lacks an integrated system for extracting and defining terms from a non-preprocessed corpus.
By
Martínez Nava, Xóchitl
In this paper it is shown how dyslexia is a factor that makes learning logic more difficult. Some methods are also provided so that the logic teacher can help dyslexic students improve their learning process. The text is directed mainly, but not exclusively, to teachers; that is to say, we do not deny the possibility that a student with such a disorder can find answers here on how to improve the way he or she learns.
By
Nava, Rodrigo; EscalanteRamírez, Boris; Cristóbal, Gabriel
7 Citations
Since Daugman found that the properties of Gabor filters match the early psychophysical features of simple receptive fields in the Human Visual System (HVS), they have been widely used to extract texture information from images for image retrieval. However, Gabor filters do not have zero mean, which produces a non-uniform coverage of the Fourier domain. This distortion causes fairly poor pattern retrieval accuracy. To address this issue, we propose a simple yet efficient image retrieval approach based on a novel log-Gabor filter scheme. We place emphasis on the filter design to preserve the relationship with receptive fields and to take advantage of their strong orientation selectivity. We provide an experimental evaluation of both Gabor and log-Gabor features using two metrics, the Kullback-Leibler (D_{KL}) and the Jensen-Shannon (D_{JS}) divergences. The experiments with the USC-SIPI database confirm that our proposal shows better retrieval performance than the classic Gabor features.
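The two divergences used as retrieval metrics can be written directly from their definitions. A minimal sketch over discrete histograms (assumed normalized to sum to 1), in natural logarithms:

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence D_KL(p || q) in nats; terms with
    p_i = 0 contribute nothing by convention."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js(p, q):
    """Jensen-Shannon divergence: the average of the KL divergences of
    p and q to their midpoint mixture. Symmetric and bounded by ln 2."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)
```

The Jensen-Shannon form is often preferred for retrieval because, unlike raw KL, it is symmetric and remains finite even when one histogram has empty bins.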
By
MurguíaRomero, Miguel; VillalobosMolina, Rafael; MéndezCruz, René; JiménezFlores, Rafael
2 Citations
We studied the variability of obesity, measured through waist circumference, in a sample of 4,164 young Mexicans (17 to 24 years old). According to the American Heart Association, obesity is one of the five clinical alterations that define the Metabolic Syndrome (MS); the other four are low levels of HDL cholesterol and high values of triglycerides, glucose, and blood pressure. A cutoff point of 80 cm for women and 90 cm for men has been proposed to separate normal from altered waist circumference values for Mexicans. We assume that the waist circumference of a healthy population has a normal distribution, so a monolithic cutoff point is only an upper limit for normal values. The objective of this work is to estimate the underlying normal distribution of the waist circumference of healthy people, involving the other four components of the MS in the analysis and approaching the problem as a combinatorial one. We defined a combination of cutoff points for the other four components of the MS; considering a set of 50 candidate cutoff points for each of the five parameters results in a search space of 50^{5} (more than 300 million) combinations. Each particular combination of cutoff points (excluding waist circumference) defines a subpopulation in which the parameter values fall into the normal ranges so defined; for each such subpopulation we calculated the histogram of the waist circumference values. Using a heuristic function involving the symmetry of the histogram (its skewness), we applied a best-first search over the combinations of cutoff points. We found the combination of cutoff point values that generates the most symmetrical histogram, and we propose it as a useful criterion for setting cutoff points of MS parameters for Mexicans. Finally, the obtained histogram is proposed as the normal distribution of the healthy population, representing the variability of the waist circumference of non-obese young Mexicans.
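The symmetry heuristic can be sketched as a plain skewness computation (the standard moment-based formula; the paper's exact heuristic and best-first bookkeeping are not reproduced here):

```python
def skewness(xs):
    """Sample skewness: the third standardized moment. It is near zero
    for a symmetric sample, positive for a right-skewed one, and
    negative for a left-skewed one."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n  # variance (biased)
    m3 = sum((x - mean) ** 3 for x in xs) / n  # third central moment
    return m3 / m2 ** 1.5
```

A search over cutoff combinations can then score each candidate subpopulation by |skewness| of its waist-circumference values and expand the most symmetric candidates first.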
By
MoraGutiérrez, Roman Anselmo; RamírezRodríguez, Javier; RincónGarcía, Eric Alfredo
14 Citations
In this paper we propose a new multi-agent metaheuristic, based on an artificial society that uses a dynamic creative system to compose music, called the “Method of Musical Composition” (MMC). To show the performance of the proposed MMC algorithm, we use 13 benchmark continuous optimization problems and compare the results with harmony search, improved harmony search, global-best harmony search, and self-adaptive harmony search. The experimental results demonstrate that MMC improves on the results obtained by the other metaheuristics on a set of multimodal functions.
By
EstivillCastro, Vladimir; Rosenblueth, David A.
3 Citations
We show that recent Model-driven Engineering that uses sequential finite-state models in combination with a common-sense logic is subject to efficient model checking. To achieve this, we first provide a formal semantics for the models. Using this semantics and methods for modeling sequential programs, we obtain small Kripke structures. When considering the logics, we need to extend this to handle external variables and the possibility of those variables being affected at any time during the execution of the sequential finite-state machine; thus, we extend the construction of the Kripke structure to this case. As a proof of concept, we use a classical example, modeling the behavior of a microwave oven and producing the corresponding software directly from the models. The construction of the Kripke structure has been implemented using flex, bison, and C++, and properties are verified using NuSMV.
By
Cunha, Iria; SanJuan, Eric; TorresMoreno, JuanManuel; Cabré, M. Teresa; Sierra, Gerardo
1 Citations
Nowadays automatic discourse analysis is a very prominent research topic, since it is useful for developing several applications, such as automatic summarization, automatic translation, and information extraction. Rhetorical Structure Theory (RST) is the most widely employed theory. Nevertheless, there are not many studies on this subject in Spanish. In this paper we present the first system assigning nuclearity and rhetorical relations to intra-sentence discourse segments in Spanish texts. To carry out the research, we analyze the learning corpus of the RST Spanish Treebank, a corpus of manually annotated specialized texts, in order to build a list of lexical and syntactic patterns marking rhetorical relations. To implement the system, this pattern list and a discourse segmenter called DiSeg are used. To evaluate the system, it is applied to the test corpus of the RST Spanish Treebank. The automatic and manual rhetorical analyses of each sentence are compared by means of recall and precision, obtaining positive results.
By
CruzBastida, JuanPablo; RosadoMéndez, Iván; PérezPonce, Héctor; Villaseñor, Yolanda; Galván, Héctor A.; TrujilloZamudio, Flavio E.; BenítezBribiesca, Luis; Brandan, MaríaEster
4 Citations
CEDM is a radiological technique based on the use of digital mammography equipment and the injection of an iodinated contrast medium to enhance the visualization of tissues of interest. In previous works, our group has proposed a formalism for dual-energy temporal CEDM, based on the weighted subtraction of images, that has been applied with success to phantom data. This methodology requires the selection of ROIs by a radiologist to determine the weight factors. In this work, we propose an alternative that improves the contrast in clinical images resulting from dual-energy temporal CEDM subtraction, while freeing the method from ambiguities due to ROI selection by a radiologist. The new subtraction algorithm is based on weight factors calculated pixel by pixel. The main result, after evaluation of the methodology on the images of 10 randomly chosen patients, is a substantial improvement in contrast (~5 times), reaching values similar to those obtained with single-energy subtraction.
By
KuriMorales, Angel Fernando; AldanaBobadilla, Edwin; LópezPeña, Ignacio
8 Citations
Genetic Algorithms (GAs) have long been recognized as powerful tools for the optimization of complex problems where traditional techniques do not apply. In [1] we reported the superior behavior, out of 4 evolutionary algorithms and a hill climber, of a particular breed: the so-called Eclectic Genetic Algorithm (EGA). EGA was tested against a set (TS) consisting of a large number of selected problems, most of which have been used in previous works as an experimental testbed. However, the conclusions of the said benchmark are restricted to the functions in TS. In this work we extend the previous results to a much larger set U consisting of $\xi \approx \sum_{i=1}^{31} (2^{64})^{i} \approx 11 \times 10^{50}$ unconstrained functions. Randomly selected functions in U were minimized for 800 generations each; the minima were averaged in batches of 36, yielding $\overline{X}_i$ for the i-th batch. This process was repeated until the $\overline{X}_i$'s displayed a Gaussian distribution with parameters $\mu_{\overline{X}}$ and $\sigma_{\overline{X}}$. From these, the parameters μ and σ describing the probabilistic behavior of each of the algorithms over U were calculated with 95% reliability. We give a sketch of the proof of the convergence of an elitist GA to the global optimum of any given function. We describe the methods to: a) generate the functions; b) calculate μ and σ for U; and c) evaluate the relative efficiency of all the algorithms in our study. EGA's behavior was the best of all the algorithms.
