Showing 1 to 32 of 32 matching Articles
By
Rodriguez-Tello, Eduardo; Hao, Jin-Kao; Torres-Jimenez, Jose
This paper introduces a new evaluation function, called δ, for the Bandwidth Minimization Problem for Graphs (BMPG). Compared with the classical β evaluation function, our δ function is much more discriminating and leads to smoother landscapes. The main characteristics of δ are analyzed and its practical usefulness is assessed within a Simulated Annealing algorithm. Experiments show that, thanks to the use of the δ function, we are able to improve on some previous best results for a set of well-known benchmarks.
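For readers unfamiliar with the classical measure, the β value of a labeling can be computed in a few lines. This is an illustrative sketch (function and graph names are ours, not the paper's):

```python
# Illustrative sketch (not the authors' code): the classical bandwidth
# beta of a labeling is the maximum absolute label difference over edges.

def bandwidth(edges, labeling):
    """beta = max |labeling[u] - labeling[v]| over all edges (u, v)."""
    return max(abs(labeling[u] - labeling[v]) for u, v in edges)

# A path graph 0-1-2-3 labeled in order has bandwidth 1;
# scrambling the labels can only increase it.
path_edges = [(0, 1), (1, 2), (2, 3)]
print(bandwidth(path_edges, {0: 0, 1: 1, 2: 2, 3: 3}))  # 1
print(bandwidth(path_edges, {0: 0, 1: 3, 2: 2, 3: 1}))  # 3
```

Note that β reports only the single worst edge, which is exactly the lack of discrimination the δ function is designed to address.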
By
Rodriguez-Tello, Eduardo; Torres-Jimenez, Jose
1 Citation
Genetic algorithms have been successfully applied to many difficult problems but there have been some disappointing results as well. In these cases the choice of the internal representation and genetic operators greatly conditions the result.
In this paper a GA and a reordering algorithm were used to solve SAT instances. The reordering algorithm produces a more suitable encoding for a GA, which enables an improvement in GA performance. The attained improvement relies on the building-block hypothesis, which states that a GA works well when short, low-order, highly fit schemata (building blocks) recombine to form even more highly fit, higher-order schemata. The reordering algorithm delivers a representation in which the most related bits (i.e., Boolean variables) occupy closer positions inside the chromosome.
The results of experimentation demonstrated that the proposed approach improves the performance of a simple GA in all the tests performed. These experiments also allowed us to observe the relation among the internal representation, the genetic operators, and the performance of a GA.
By
Paz-Ramos, Marco Antonio; Torres-Jimenez, Jose; Quintero-Marmol-Marquez, Enrique; Estrada-Esquivel, Hugo
2 Citations
During the last years the use of intelligent strategies for tuning Proportional-Integral-Derivative (PID) controllers has been growing. Evolutionary strategies have won an important place thanks to their flexibility. In this paper, the automatic tuning of systems with stable and unstable dynamics through a genetic approach is presented. The advantages of the proposed approach were highlighted through comparison with the Ziegler-Nichols modified closed-loop method and the Visioli genetic approach. The goal of the proposed methodology is to expand the application of intelligent tuning to a wider range of processes (covering systems with oscillatory or unstable modes).
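The tuning loop described above can be illustrated very loosely as a search over PID gains scored on a simulated step response. The sketch below uses plain random search and an assumed first-order plant rather than the authors' genetic approach; every constant and range here is an illustrative assumption:

```python
import random

# Loose illustration (not the authors' GA): candidate PID gains are scored
# by the integral squared error of a unit step response on an assumed
# first-order plant  tau * y' = u - y; plain random search stands in for
# the evolutionary loop.

def step_cost(kp, ki, kd, tau=1.0, dt=0.01, steps=1000):
    y, integ, prev_err = 0.0, 0.0, 1.0
    cost = 0.0
    for _ in range(steps):
        err = 1.0 - y                    # unit step setpoint
        integ += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integ + kd * deriv
        y += dt * (u - y) / tau          # Euler integration of the plant
        prev_err = err
        cost += err * err * dt           # integral squared error
    return cost

random.seed(1)
best_gains, best_cost = (1.0, 0.0, 0.0), step_cost(1.0, 0.0, 0.0)
for _ in range(200):                     # crude stand-in for generations
    cand = (random.uniform(0, 10), random.uniform(0, 5), random.uniform(0, 0.5))
    c = step_cost(*cand)
    if c < best_cost:
        best_gains, best_cost = cand, c
print(best_cost <= step_cost(1.0, 0.0, 0.0))  # True by construction
```

A genetic approach replaces the random proposals with selection, crossover, and mutation over a population of gain triples, but the fitness evaluation has this same shape.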
By
Gonzalez-Hernandez, Loreto; Rangel-Valdez, Nelson; Torres-Jimenez, Jose
6 Citations
The development of a new system involves extensive tests on the software functionality in order to identify possible failures. Also, a system already built requires fine tuning of its configurable options to give the best performance in the environment in which it is going to work. Both cases require a finite set of tests that avoids testing all possible combinations (which is time consuming); in this situation Mixed Covering Arrays (MCAs) are a feasible alternative. MCAs are combinatorial structures represented as matrices with one test case per row. MCAs are small in comparison with brute force, and they guarantee a level of interaction among the parameters involved (a difference from random testing). We present a Tabu Search (TS) algorithm to construct MCAs; the novelty of the algorithm is a mixture of three neighborhood functions. We also present a new benchmark for the MCA problem. The experimental evidence showed that the TS algorithm improves the results obtained by other approaches reported in the literature, finding the optimal solution in some of the solved cases.
By
Jose-Garcia, Adan; Romero-Monsivais, Hillel; Hernandez-Morales, Cindy G.; Rodriguez-Cristerna, Arturo; Rivera-Islas, Ivan; Torres-Jimenez, Jose
2 Citations
Cryptosystems require the computation of modular exponentiation; this operation is related to the problem of finding a minimal addition chain. However, obtaining the shortest addition chain for n is an NP-Complete problem whose search space size is proportional to n!. This paper introduces a novel idea to attack the minimal addition chain problem through an implementation of a Simulated Annealing (SA) algorithm. The representation used in our SA is based on the Factorial Number System (FNS). We use a fine-tuning process to get the best performance of SA using a Covering Array (CA), Diophantine Equation solutions (DE), and Neighborhood Functions (NF). We present a parallel implementation to execute the fine-tuning process using the Message Passing Interface (MPI) and the Single Program Multiple Data (SPMD) model. These features allowed us to calculate minimal addition chains in very short time for benchmarks considered difficult; the experimental results show that this approach is a viable alternative for solving the minimal addition chain problem.
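For readers unfamiliar with addition chains: a chain for n is a sequence starting at 1 and ending at n in which each element is the sum of two (possibly equal) earlier elements; its length is the number of multiplications needed to compute x^n. A minimal validity check (our sketch, not the paper's SA):

```python
# Illustrative check (not the authors' algorithm): a valid addition chain
# for n starts at 1, ends at n, and every later element is the sum of two
# (possibly equal) earlier elements.

def is_addition_chain(chain, n):
    if chain[0] != 1 or chain[-1] != n:
        return False
    for i in range(1, len(chain)):
        earlier = chain[:i]
        if not any(a + b == chain[i] for a in earlier for b in earlier):
            return False
    return True

# x^15 can be computed with 5 multiplications via this chain, one fewer
# than the 6 operations of binary (square-and-multiply) exponentiation.
print(is_addition_chain([1, 2, 3, 6, 12, 15], 15))  # True
```

Finding the shortest such chain is the hard part; validating one, as above, is cheap.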
By
Avila-George, Himer; Torres-Jimenez, Jose; Rangel-Valdez, Nelson; Carrión, Abel; Hernández, Vicente
7 Citations
Covering Arrays (CAs) are mathematical objects with minimal coverage and maximum cardinality that are a good tool for the design of experiments. A covering array is an N×k matrix over an alphabet of v symbols s.t. each N×t subarray contains at least once each combination from {0,1,…,v−1}^t, given a positive integer value t. The process of ensuring that a CA contains each of the v^t combinations is called verification of the CA. In this paper, we present an algorithm for CA verification and its implementation details in three different computation paradigms: (a) a sequential approach (SA); (b) a parallel approach (PA); and (c) a Grid approach (GA). Four different PAs were compared on their performance in verifying a matrix as a CA; the PA with the best performance was included in a further experiment in which the three paradigms, SA, PA, and GA, were compared on a benchmark composed of 45 possible CA instances. The results showed the limitations of the different paradigms when solving the CA verification problem, and point out the necessity of a Grid approach when the size of a CA grows.
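The verification task being parallelized can be stated as a brute-force check; the following sketch (our illustration of the sequential idea, not the paper's code) tests every choice of t columns:

```python
from itertools import combinations, product

# Brute-force sketch of sequential CA verification: a matrix is a
# CA(N;t,k,v) iff every choice of t columns covers all v^t t-tuples.

def is_covering_array(matrix, t, v):
    k = len(matrix[0])
    needed = set(product(range(v), repeat=t))
    for cols in combinations(range(k), t):
        seen = {tuple(row[c] for c in cols) for row in matrix}
        if not needed <= seen:
            return False
    return True

# CA(4;2,3,2): every pair of columns covers 00, 01, 10, 11.
ca = [[0, 0, 0],
      [0, 1, 1],
      [1, 0, 1],
      [1, 1, 0]]
print(is_covering_array(ca, t=2, v=2))  # True
```

The C(k,t) column subsets are independent, which is what makes the parallel and Grid decompositions in the paper natural: each worker can verify a disjoint slice of the subsets.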
By
Torres-Jimenez, Jose; Alfonso, Carlos; Hernández, Vicente
3 Citations
Grid technology emerged (mainly) in response to the need to make efficient use of underutilized computer resources, and the availability of much commercial and freeware grid management software is making the dream of huge distributed grid computing at reasonable cost a reality. In this paper a brief introduction to the concept of grid computing is presented, and, in order to demonstrate the usefulness of the grid computing approach, it was applied to compute instances of a hard NP-Complete problem, namely ternary covering array (CA) computation, using a mutation-selection algorithm that ran using InnerGRID over a computer cluster at UPV.
Topics: Cluster and Grid Computing.
By
Rodriguez-Tello, Eduardo; Hao, Jin-Kao; Torres-Jimenez, Jose
2 Citations
This paper presents a new Memetic Algorithm designed to compute near-optimal solutions for the MinLA problem. It incorporates a highly specialized crossover operator, a fast MinLA heuristic used to create the initial population, and a local search operator based on a fine-tuned Simulated Annealing algorithm. Its performance is investigated through extensive experimentation over well-known benchmarks and compared with other state-of-the-art algorithms.
By
Paz-Ramos, Marco Antonio; Torres-Jimenez, Jose; Quintero-Marmol-Marquez, Enrique
1 Citation
During the last years the use of intelligent strategies for tuning Proportional-Integral-Derivative (PID) controllers has been growing. Evolutionary strategies have won an important place thanks to their flexibility. In this paper, the automatic tuning of systems with scarce initial information and integrative and unstable dynamics through a genetic approach is presented. The advantages of the proposed approach were highlighted through comparison with the Ziegler-Nichols modified closed-loop method and the Visioli genetic approach. The goal of the proposed methodology is to expand the application of intelligent tuning to a wider range of processes (covering systems with oscillatory or unstable modes).
By
Estrada, Hugo; Pastor, Oscar; Martínez, Alicia; Torres-Jimenez, Jose
2 Citations
At present, organizational requirements are considered to be one of the most important aspects in the development of information systems. Many research efforts in software engineering have focused on integrating organizational modeling as a key piece of requirements engineering. However, the majority of these works focus only on the definition of notations that permit the representation of the semantics of the organizational context, and only a few works define processes that use these notations in a methodological way. This lack of a methodological process for generating organizational models makes practical application in software development enterprises difficult. The objective of this paper is to present a goal-based method to obtain and refine organizational requirements. These requirements are used to validate the understanding of the organizational process before constructing the information system. This enables us to develop information systems that integrate the functionality necessary for the organizational actors to perform their tasks and fulfill their goals.
By
Rodriguez-Tello, Eduardo; Hao, Jin-Kao; Torres-Jimenez, Jose
This paper introduces a refined evaluation function, called Φ, for the Minimum Linear Arrangement problem (MinLA). Compared with the classical evaluation function (LA), Φ integrates additional information contained in an arrangement to distinguish arrangements with the same LA value. The main characteristics of Φ are analyzed and its practical usefulness is assessed within both a Steepest Descent (SD) algorithm and a Memetic Algorithm (MA). Experiments show that the use of Φ boosts the performance of SD and MA, leading to improvement on some previous best-known solutions.
By
Ortega-Sanchez, Cesar; Torres-Jimenez, Jose; Morales-Cruz, Jorge
1 Citation
This paper presents a genetic algorithm (GA) that solves the problem of routing a multiplexer network onto a MUXTREE embryonic array. The procedure to translate the multiplexer network into a form suitable for the GA-based router is explained. The genetic algorithm works on a population of configuration registers (genomes) that define the functionality and connectivity of the array. The fitness of each individual is evaluated and those closer to solving the required routing are selected for the next generation. A matrix-based method to evaluate the routing defined by each individual is also explained. The output of the genetic router is a VHDL program describing a lookup table that receives the cell coordinates as inputs and returns the value of the corresponding configuration register. The routing of a modulo-10 counter is presented as an example of the capabilities of the genetic router. The genetic algorithm approach provides not one but multiple solutions to the routing problem, opening the road to a new level of redundancy where a new “genome” can be downloaded to the array when the conventional reconfiguration strategy runs out of spare cells.
By
Martinez-Pena, Jorge; Torres-Jimenez, Jose; Rangel-Valdez, Nelson; Avila-George, Himer
3 Citations
This paper presents a simulated annealing (SA) algorithm for the construction of ternary covering arrays (CAs) using a trinomial coefficient representation. A ternary CA, denoted by CA(N;t,k,3), is an N×k array where each N×t subarray contains each of the 3^t combinations of symbols at least once. The construction of optimal CAs is, in general, an NP-complete problem. Many reported SA implementations use an N×k matrix representation for CA construction. Instead of this, we represent ternary CAs using trinomial coefficients in order to reduce the search space for the SA algorithm.
By
Rodriguez-Tello, Eduardo; Torres-Jimenez, Jose
3 Citations
This paper presents a new Memetic Algorithm (MA) designed to compute near-optimal solutions for the covering array construction problem. It incorporates several distinguishing features, including an efficient heuristic to generate a good-quality initial population and a local search operator based on a fine-tuned Simulated Annealing (SA) algorithm employing a carefully designed compound neighborhood. Its performance is investigated through extensive experimentation over well-known benchmarks and compared with other state-of-the-art algorithms, showing improvements on some previous best-known results.
By
Rangel-Valdez, Nelson; Torres-Jimenez, Jose
It is known that some NP-Complete problems exhibit sharp phase transitions with respect to some order parameter. Moreover, in some of these problems a correlation exists between that critical behavior and the hardness of finding a solution. This paper shows experimental evidence of the existence of a critical behavior in the computational cost of solving the bandwidth minimization problem for graphs (BMPG). The experimental design involved the density of a graph as the order parameter, 200,000 random connected graphs with 16 to 25 nodes, and a branch and bound algorithm taken from the literature. The results reveal a bimodal phase transition in the computational cost of solving the BMPG instances. This behavior was confirmed with the results obtained by metaheuristics that solve a known BMPG benchmark.
By
Rodriguez-Tello, Eduardo; Hao, Jin-Kao; Torres-Jimenez, Jose
In this paper the Minimum Linear Arrangement (MinLA) problem is studied within the framework of memetic algorithms (MA). A new dedicated recombination operator called Trajectory Crossover (TX) is introduced and its performance is compared with four previous crossover operators. It is shown that the TX crossover induces a better population diversity. The MA using TX is evaluated on a set of well-known benchmark instances and is compared with several state-of-the-art MinLA algorithms.
By
Bracho-Rios, Josue; Torres-Jimenez, Jose; Rodriguez-Tello, Eduardo
5 Citations
A Covering Array, denoted by CA(N;t,k,v), is a matrix of size N×k in which each of the v^t combinations appears at least once in every set of t columns. Covering Arrays (CAs) are combinatorial objects used in software testing. There are different methods to construct CAs but, as it is a highly combinatorial problem, few complete algorithms to construct CAs have been reported. In this paper a new backtracking algorithm based on the Branch & Bound technique is presented. It searches only non-isomorphic Covering Arrays to reduce the search space of the problem of constructing them. The results obtained with this algorithm match some of the best-known solutions for small instances of binary CAs.
By
Torres-Jimenez, Jose; Rodriguez-Tello, Eduardo
1 Citation
The Bandwidth Minimization Problem for Graphs (BMPG) can be defined as finding a labeling for the vertices of a graph where the maximum absolute difference between labels of each pair of connected vertices is minimum. The measure most used by BMPG algorithms is β, which indicates only the maximum of all absolute differences.
After analyzing some drawbacks of β, a measure called γ is given, which uses a positional numeral system with variable base and takes into account all the absolute differences of a graph.
In order to test the performance of γ and β, a stochastic search procedure based on a Simulated Annealing (SA) algorithm has been applied to solve the BMPG. The experiments show that the SA that uses γ obtains better results for many classes of graphs than the one that uses β.
By
Torres-Jimenez, Jose; Avila-George, Himer; Rangel-Valdez, Nelson; Martinez-Pena, Jorge
This paper presents a novel use of the SQL language to solve a practical optimization problem: finding the portfolio size and the quantity of money assigned to securities. This problem is known as the Portfolio Selection Problem (PSP). The approach was tested on 9 random instances of PSP. Each instance has up to 12 securities and 50 different choices of money. Each security follows a nonlinear profit model. The limitations of our approach are bounded by the computer resources, given that SQL potentially constructs the Cartesian product of the investment options, but it has the advantages of not requiring complex structures and of being easy to understand.
By
Gonzalez-Hernandez, Loreto; Torres-Jimenez, Jose; Rangel-Valdez, Nelson; Bracho-Rios, Josue
The development of a new software system involves extensive tests on the software functionality in order to identify possible failures. It would be ideal to test all possible input cases (configurations), but the exhaustive approach usually demands too much cost and time. The test suite reduction problem can be defined as the task of generating a small set of test cases under certain requirements. One way to design test suites is through interaction testing using a matrix called a Covering Array, CA(N;t,k,v), which guarantees that all configurations among every t parameters are covered. This paper presents a simple strategy that reduces the number of rows of a CA. The algorithms represent a post-optimization process which detects wild cards (values that can be changed arbitrarily without the CA losing its degree of coverage) and uses them to merge rows. In the experiment, 667 CAs created by a state-of-the-art algorithm were subjected to the reduction process. The results report a reduction in the size of 347 CAs (52% of the cases). As part of these results, we report the matrix for CA(42;2,8,6), constructed from CA(57;2,8,6) with an impressive reduction of 15 rows, which is the best upper bound so far.
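The wild-card notion can be made concrete for strength t = 2: a cell is a wild card when every pair it covers in its row is also covered by some other row, so its value can be changed freely without losing 2-way coverage. A hedged sketch (our simplification, not the paper's post-optimization algorithm):

```python
# Illustrative wild-card detection for strength 2 (not the paper's code):
# cell (r, c) is a wild card if, for every other column c2, the pair of
# values it covers in row r also appears in some other row.

def wild_cards(matrix):
    n, k = len(matrix), len(matrix[0])
    cells = []
    for r in range(n):
        for c in range(k):
            redundant = all(
                any(matrix[r2][c] == matrix[r][c] and
                    matrix[r2][c2] == matrix[r][c2]
                    for r2 in range(n) if r2 != r)
                for c2 in range(k) if c2 != c
            )
            if redundant:
                cells.append((r, c))
    return cells

# In this over-sized array the duplicated last row makes every cell of
# rows 3 and 4 a wild card, so one of the two rows can be merged away.
ca = [[0, 0, 0],
      [0, 1, 1],
      [1, 0, 1],
      [1, 1, 0],
      [1, 1, 0]]
print(wild_cards(ca))
```

Rows whose cells are all wild cards are candidates for merging, which is the row-reduction step the paper exploits.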
By
Avila-George, Himer; Torres-Jimenez, Jose; Hernández, Vicente
In software systems, a common source of bugs is unexpected interactions among system components. This risk increases as the number of software components increases. To reduce this risk and ensure software quality, it may be necessary to test all interactions among the components. Combinatorial testing is a method that can reduce cost and increase the effectiveness of software testing for many applications. Covering arrays are combinatorial structures which can be used to represent test suites. This paper presents a metaheuristic approach based on a simulated annealing algorithm for constructing covering arrays. The experimental design solved a benchmark reported in the literature, and a new benchmark based on real test cases is proposed. Experimental evidence showed that the simulated annealing algorithm equals or improves the results obtained by other approaches reported in the literature, finding the optimal solution in some of the solved cases.
By
Torres-Jimenez, Jose; Izquierdo-Marquez, Idelfonso; Gonzalez-Gomez, Aldo; Avila-George, Himer
1 Citation
Covering arrays are used in testing deterministic systems where failures occur as a result of interactions among subsystems. The goal is to reveal whether any interaction induces a failure in the system. Application areas include software and hardware testing. A binary covering array CA(N;t,k,2) is an N×k array over the alphabet {0,1} with the property that each set of t columns contains all the 2^t possible t-tuples of 0's and 1's at least once. In this paper we propose a direct method to construct binary covering arrays using a specific interpretation of binomial coefficients: a binomial coefficient with parameters k and r will be interpreted as the set of all the k-tuples from {0,1} having r ones and k−r zeroes. For given values of k and t, the direct method uses an explicit formula in terms of both k and t to provide a covering array CA(N;t,k,2) expressed as the juxtaposition of a set of binomial coefficients; this covering array will be of the minimum size that can be obtained by any juxtaposition of binomial coefficients. In order to derive the formula, a Branch & Bound (B&B) algorithm was first developed; the B&B algorithm provided solutions for small values of k and t that allowed the identification of the general pattern of the solutions. Like other previously reported methods, our direct method finds optimal covering arrays for k = t+1 and k = t+2; however, the major achievement is that nine upper bounds were significantly improved by our direct method, plus the fact that the method is able to set an infinite number of new upper bounds for t ≥ 7, given that little work has been done to compute binary covering arrays for general values of k and t.
By
Gonzalez-Hernandez, Loreto; Torres-Jimenez, Jose
Software systems are increasingly used by our society, so a failure in them can lead to large losses. To reduce the failures of a software system it is necessary to carry out the testing process appropriately. Combinatorial testing helps in the testing process by providing structures with a test set of small size, like Mixed Covering Arrays (MCAs). However, the problem of constructing an optimal test set is an NP-complete problem, leading to the development of non-exhaustive approaches to solve it. This paper proposes a new Tabu Search (TS) approach called MiTS (which stands for Mixed Tabu Search) that focuses on constructing MCAs. The approach is based on the use of a mixture of neighborhood functions and a fine-tuning process to improve the performance of the TS. Experimental evidence shows a poor performance when a single neighborhood function is used. On the other hand, the TS using a mixture of neighborhood functions is competitive in the construction of MCAs over a known benchmark reported in the literature.
By
Ordoñez, Hugo; Torres-Jimenez, Jose; Ordoñez, Armando; Cobos, Carlos
Due to the large volume of business process repositories, manually finding a particular process, or a subset of them, based on similarities in functionality or activities may become a difficult task. This paper presents a search method and a clustering method for business process models. The search method considers linguistic and behavioral information. The clustering method is based on covering arrays (a combinatorial object used to minimize the set of trials needed to find a particular structure). The clustering method avoids overlapping and improves the homogeneity of the groups created using a covering array. The obtained results outperform previously reported approaches in precision, recall, and F-Measure.
By
Timaná, Jimena; Cobos, Carlos; Torres-Jimenez, Jose
Covering Arrays (CAs) are mathematical objects widely used in the design of experiments in several areas of knowledge and more recently applied to hardware and software testing. CA construction is a complex task that entails a high run time and high computational load. To date, research has been carried out on constructing optimal CAs using exact methods, algebraic methods, greedy methods, and metaheuristic-based methods. The latter, including among them Simulated Annealing and Tabu Search, have reported the best results in the literature. Their effectiveness is largely due to the use of local optimization techniques with different neighborhood schemes. Given the excellent results of the Global-best Harmony Search (GHS) algorithm in various optimization problems, and given that it has not been explored in CA construction, this paper presents a memetic algorithm (GHS-SA) using GHS for global search, SA for local search, and two neighborhood schemes for the construction of uniform and mixed CAs of different strengths. GHS-SA achieved competitive results in comparison with the state of the art and in experimentation did not require the use of supercomputers.
By
Lopez-Escogido, Daniel; Torres-Jimenez, Jose; Rodriguez-Tello, Eduardo; Rangel-Valdez, Nelson
11 Citations
According to the NIST report of 2002, there is great potential to reduce the cost, and to increase the quality, of the software developed in the USA through the creation of automated tools that help in the software testing process. One alternative to improve the software testing process is the creation of tools that generate test cases automatically. Through the construction of Covering Arrays (CAs) it is possible to obtain a minimum set of test cases with the maximum possibility of testing all the functionality of the developed software.
In this paper an approach to construct CAs using the propositional satisfiability problem (SAT) is presented. The approach starts with the transformation of a CA instance into a non-conjunctive normal form (non-CNF) SAT instance. Then the SAT instance is solved through a non-CNF SAT solver, and finally the SAT solution is transformed into a CA solution. The main contributions of this work are: an efficient non-CNF SAT solver able to equal or improve previously reported results, and a simplified SAT representation to solve the CA problem.
By
Ruano, Edgar; Cobos, Carlos; Torres-Jimenez, Jose
1 Citation
The rise of Bus Rapid Transit Systems (BRTS) in urban centers involves complex problems of design and scheduling, including the scheduling of route intervals across the bus network. The difficulty stems from the fact that transport systems keep to established routes and must set frequencies for each route to minimize costs (measured in terms of wasted transport capacity) and maximize the quality of service (minimizing the total time of users in the system). All this depends on the maximum number of buses available in the system. In an effort to find an alternative solution to the Transit Network Frequencies Setting Problem (TNFSP) on BRTS, this paper proposes using Multi-Objective Global-Best Harmony Search (MOGBHS), a multi-objective heuristic algorithm based on three main components: (1) Global-Best Harmony Search, as a heuristic optimization strategy; (2) Non-Dominated Sorting, as a multi-objective optimization strategy; and (3) Discrete Event Simulation, for obtaining quality measures of the solutions found. To test the proposed approach, a simulation model was implemented for Megabus, a BRTS located in Pereira (Colombia), for which the frequency of the buses assigned to routes previously defined in the system was optimized so that operating costs were reduced to a minimum while user satisfaction was maximized. The MOGBHS algorithm was compared with NSGA-II. It was concluded that MOGBHS outperformed NSGA-II in the number of optimal solutions found (Pareto front points), from 175% at 3,000 fitness function evaluations to 488% at 27,000 evaluations.
By
Dorado, Hugo; Cobos, Carlos; Torres-Jimenez, Jose; Jimenez, Daniel; Mendoza, Martha
Methods for variable importance measures and feature selection in classification/regression tasks in data mining and Big Data enable the removal of noise caused by irrelevant or redundant variables, reduce the computational cost of model construction, and facilitate the understanding of the resulting models. This paper presents a proposal to measure the importance of the input variables in a classification/regression problem, taking as input the solutions evaluated by a wrapper and the performance information (quality of classification expressed, for example, in accuracy, precision, recall, F-measure, among others) associated with each of these solutions. The proposed method quantifies the effect on classification/regression performance produced by the presence or absence of each input variable in the subsets evaluated by the wrapper. This measure has the advantage of being specific to each classifier, which makes it possible to differentiate the effects each input variable can generate depending on the model built. The proposed method was evaluated using the results of three wrappers (one based on genetic algorithms (GA), another on particle swarm optimization (PSO), and a new proposal based on covering arrays (CA)) and compared with two filters and the variable importance measure of Random Forest. The experiments were performed with three classifiers (Naive Bayes, Random Forest, and Multi-Layer Perceptron) and seven data sets from the UCI repository. The comparisons were made using Friedman's Aligned Ranks test, and the results indicate that the proposed measure stands out for keeping higher classification quality in the first input variables, better approximating the variables found by the feature selection methods.
By
Torres-Jimenez, Jose; Cruz, Laura; Rangel-Valdez, Nelson
Summary
Artificial Intelligence (AI) is the core of many current technologies with diverse applications, such as: a) systems that understand spoken language; b) intelligent tutors that assist in the process of learning new concepts; c) systems that detect patterns in huge amounts of data; etc. AI has also originated many spin-off technologies that are now part of our daily lives, e.g. a) the mouse; b) symbolic programming languages; c) symbolic computation systems like Macsyma. This work is related to the field of symbolic computation; specifically, we present an optimized algorithm that is able to compute the symbolic summation of polynomials. The algorithm is based on the solution of a system of simultaneous equations that delivers the coefficients of the resulting polynomial.
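The core idea (recovering the coefficients of the summation polynomial from a system of simultaneous equations) can be sketched as follows. This is our illustration of the general technique, not the authors' optimized algorithm: the sum S(n) = 1^p + 2^p + … + n^p is a polynomial of degree p + 1, so sampling it at p + 2 points and solving the resulting linear system yields its coefficients exactly.

```python
from fractions import Fraction

# Sketch of the simultaneous-equations idea (not the authors' optimized
# algorithm): S(n) = sum_{i=1}^{n} i^p is a polynomial of degree p + 1,
# so its p + 2 coefficients are recovered by solving the linear system
# given by the sample values S(1), ..., S(p+2) in exact rational arithmetic.

def summation_polynomial(p):
    d = p + 2                                     # number of coefficients
    # Vandermonde system  V a = s  from sample points n = 1..d.
    rows = [[Fraction(n) ** j for j in range(d)] for n in range(1, d + 1)]
    s = [Fraction(sum(i ** p for i in range(1, n + 1))) for n in range(1, d + 1)]
    # Gauss-Jordan elimination over the rationals.
    for col in range(d):
        piv = next(r for r in range(col, d) if rows[r][col] != 0)
        rows[col], rows[piv] = rows[piv], rows[col]
        s[col], s[piv] = s[piv], s[col]
        for r in range(d):
            if r != col and rows[r][col] != 0:
                f = rows[r][col] / rows[col][col]
                rows[r] = [a - f * b for a, b in zip(rows[r], rows[col])]
                s[r] -= f * s[col]
    return [s[r] / rows[r][r] for r in range(d)]  # coefficients of n^0..n^{p+1}

# sum i^2 = n/6 + n^2/2 + n^3/3, i.e. n(n+1)(2n+1)/6
print(summation_polynomial(2))
```

Exact rationals matter here: floating-point elimination would return approximate coefficients, whereas a symbolic-computation setting requires them exactly.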
By
Torres-Jimenez, Jose; Izquierdo-Marquez, Idelfonso
A covering array CA(N;t,k,v) is an N×k array such that in every N×t subarray each possible t-tuple over a v-set appears as a row of the subarray at least once. The integers t and v are respectively the strength and the order of the covering array. Let v be a prime power and let F_v denote the finite field with v elements. In this work the original concept of permutation vectors generated by a (t−1)-tuple over F_v is extended to vectors generated by a t-tuple over F_v. We call these last vectors extended permutation vectors (EPVs). For every prime power v, a covering perfect hash family CPHF(2;v^2−v+3,v^3,3) is constructed from EPVs given by subintervals of a linear feedback shift register sequence over F_v. When v ∈ {7,9,11,13,16,17,19,23,25} the covering array CA(2v^3−v;3,v^2−v+3,v) generated by CPHF(2;v^2−v+3,v^3,3) has fewer rows than the best-known covering array with strength three, v^2−v+3 columns, and order v. CPHFs formed by EPVs are also constructed using simulated annealing; in this case the results improve the size of eighteen covering arrays of strength three.
By
Torres-Jimenez, Jose; Avila-George, Himer
1 Citation
Search-based software engineering is the application of optimization techniques to solving software engineering problems. One challenge in testing software systems is the effort involved in creating test suites that will systematically test the system and reveal faults in an effective manner. Given the importance of the software testing phase, a specific subarea called search-based software testing has become increasingly important. This paper presents a search-based software testing tool (SBSTT) for constructing test suites. Through the use of SBSTT we were able to find 370 new upper bounds for binary test suites.