Showing 1 to 100 of 276 matching Articles
By
Phillips, John D.
3 Citations
The paper addresses the problem of generating sentences from logical formulae. It describes a simple and efficient algorithm for generating text which has been developed for use in machine translation, but will have wider application in natural language processing. An important property of the algorithm is that the logical form used to generate a sentence need not be one which could have been produced by parsing the sentence: formal equivalence between logical forms is allowed for. This is necessary for a machine translation system, such as the one envisaged in this paper, which uses single declarative grammars of individual languages, and declarative statements of translation equivalences for transfer. In such a system, it cannot be guaranteed that transfer will produce a logical form in the same order as would have been produced by parsing some target-language sentence, and it is not practicable to define a normal form for the logical forms. The algorithm is demonstrated using a categorial grammar and a simple indexed logic, as this allows a particularly clear and elegant formulation. It is shown that the algorithm can be adapted to phrase-structure grammars, and to more complex semantic representations than that used here.
By
Church, Kenneth W.; Hovy, Eduard H.
17 Citations
Ideally, we might hope to improve the performance of our MT systems by improving the system, but it might be even more important to improve performance by looking for a more appropriate application. A survey of the literature on evaluation of MT systems seems to suggest that the success of the evaluation often depends very strongly on the selection of an appropriate application. If the application is well-chosen, then it often becomes fairly clear how the system should be evaluated. Moreover, the evaluation is likely to make the system look good. Conversely, if the application is not clearly identified (or worse, if the application is poorly chosen), then it is often very difficult to find a satisfying evaluation paradigm. We begin our discussion with a brief review of some evaluation metrics that have been tried in the past and conclude that it is difficult to identify a satisfying evaluation paradigm that will make sense over all possible applications. It is probably wise to identify the application first, and then we will be in a much better position to address evaluation questions. The discussion will then turn to the main point, an essay on how to pick a good niche application for state-of-the-art (crummy) machine translation.
By
Reyes, Gonzalo E.; Zawadowski, Marek W.
3 Citations
In the paper [8], the first author developed a topos-theoretic approach to reference and modality (see also [5]). This approach leads naturally to modal operators on locales (or ‘spaces without points’). The aim of this paper is to develop the theory of such modal operators in the context of the theory of locales, to axiomatize the propositional modal logics arising in this context, and to study completeness and decidability of the resulting systems.
By
Hugly, Philip; Sayward, Charles
The fact that a group of axioms uses the word ‘true’ does not guarantee that that group of axioms yields a theory of truth. For Davidson, the derivability of certain biconditionals from the axioms is what guarantees this. We argue that the test does not work. In particular, we argue that if the object language has truth-value gaps, the result of applying Davidson's definition of a theory of truth is that no correct theory of truth for the language is possible.
By
Lauth, Bernhard
3 Citations
The paper investigates learning functions for first order languages. Several types of convergence and identification in the limit are defined. Positive and negative results on learning problems are presented throughout the paper.
By
Roy, Dev K.
The properties of antisymmetry and linearity are easily seen to be sufficient for a recursively enumerable binary relation to be recursively isomorphic to a recursive relation. Removing either condition allows for the existence of a structure where no recursive isomorph exists, and natural examples of such structures are surveyed.
By
Gentilini, Paolo
1 Citation
In this paper the PA-completeness of modal logic is studied by syntactical and constructive methods. The main results are theorems on the structure of the PA-proofs of suitable arithmetical interpretations S^{ϕ} of a modal sequent S, which allow the transformation of PA-proofs of S^{ϕ} into proof-trees similar to modal proof-trees. As an application of such theorems, a proof of Solovay's theorem on arithmetical completeness of the modal system G is presented for the class of modal sequents of Boolean combinations of formulas of the form □^{m_i}p_{i}, m_{i} = 0, 1, 2, ... The paper is the preliminary step for a forthcoming global syntactical resolution of the PA-completeness problem for modal logic.
By
Brady, Ross T.
1 Citation
This paper surveys the various forms of the Deduction Theorem for a broad range of relevant logics. The logics range from the basic system B of Routley-Meyer through to the system R of relevant implication, and the forms of the Deduction Theorem are characterized by the various formula representations of rules that are either unrestricted or restricted in certain ways. The formula representations cover the iterated form, A_{1} → .A_{2} → . ... .A_{n} → B, the conjunctive form, A_{1} & A_{2} & ... & A_{n} → B, the combined conjunctive and iterated form, enthymematic versions of these three forms, and the classical implicational form, A_{1} & A_{2} & ... & A_{n} ⊃ B. The concept of a general enthymeme is introduced, and the Deduction Theorem is shown to apply for rules essentially derived using Modus Ponens and Adjunction only, with logics containing either (A → B) & (B → C) → .A → C or A → B → .B → C → .A → C.
By
Balzer, Wolfgang; Lauth, Bernhard; Zoubek, Gerhard
4 Citations
A comprehensive model for describing various forms of developments in science is defined in precise, set-theoretic terms, and in the spirit of the structuralist approach in the philosophy of science. The model emends previous accounts in centering on single systems in a homogeneous way, eliminating notions which essentially refer to sets of systems. This is achieved by eliminating the distinction between theoretical and non-theoretical terms as a primitive, and by introducing the notion of intended links. The force of the model is demonstrated by formally incorporating many of the important, precise metatheoretic concepts occurring in the literature.
By
Lascarides, Alex; Asher, Nicholas
96 Citations
This paper presents a formal account of how to determine the discourse relations between propositions introduced in a text, and the relations between the events they describe. The distinct natural interpretations of texts with similar syntax are explained in terms of defeasible rules. These characterise the effects of causal knowledge and knowledge of language use on interpretation. Patterns of defeasible entailment that are supported by the logic in which the theory is expressed are shown to underlie temporal interpretation.
By
Hand, Michael
10 Citations
Conclusion
I have proposed that the complementizer that has a pragmatic property of demonstrativity, analogous to that ascribed by demonstrative analyses of the semantics of the complementizer, but not impinging on the syntactic analysis of sentential embedding. My account explains a number of phenomena, including the illocutionary peculiarities of parentheticals, the pragmatics of that-omission, and consequently the distributional statistics of that-omission and related grammatical features of embeddings reported in the literature. By this means these phenomena are theoretically unified under a single hypothesis.
Furthermore, this demonstrativity is a matter of degree. There is a spectrum of distinct pragmatic manifestations of this demonstrativity, ranging from the purely paratactic-like interpretation, which ascribes no illocutionary relation between the speaker and the complement and is highly incompatible with that-omission, to the purely parenthetical interpretation, where illocutionary force attaches to the complement and which is highly conducive to that-omission.
A more general moral appears when the minimally revisionist syntactic consequences of my proposal are compared to the radical syntactic consequences of Thompson and Mulac's. Even if we hold that the syntactic structure of a natural language cannot be grasped outside of its general communicative contexts, we need not connect syntax and pragmatics so immediately as Thompson and Mulac seem to think. By resisting the idea that the syntax of parentheticals is ipso facto different from the syntax of compositional embeddings, we allow some “slack” between syntax and pragmatics, thereby enabling us to analyze such subtle syntax-pragmatics interactions as negative-raised parentheticals. The methodological moral is to avoid too facile a connection between syntactic and pragmatic analyses.
By
Farwell, David; Guthrie, Louise; Wilks, Yorick
11 Citations
In this paper, we describe both a multilingual, interlingual MT system (ULTRA) and a method of extracting lexical entries for it automatically from an existing machine-readable dictionary (LDOCE). We believe the latter is original, and the former, although not the first interlingual MT system by any means, may be the first that is symmetrically multilingual. It translates between English, German, Chinese, Japanese and Spanish, and has vocabularies in each language based on about 10,000 word senses.
By
Basili, Roberto; Pazienza, Maria Teresa; Velardi, Paola
5 Citations
The growing availability of large on-line corpora encourages the study of word behaviour directly from accessible raw texts. However, how lexical knowledge should be extracted from plain texts is still a matter of debate and experimentation. In this paper we present an integrated tool for lexical acquisition from corpora, ARIOSTO, based on a hybrid methodology that combines typical NLP techniques, such as (shallow) syntax and semantic markers, with numerical processing. The lexical data extracted by this method, called clustered association data, are used for a variety of interesting purposes, such as the detection of selectional restrictions, the derivation of syntactic ambiguity criteria and the acquisition of taxonomic relations.
By
Basili, Roberto; Pazienza, Maria Teresa; Velardi, Paola
4 Citations
When implementing computational lexicons it is important to keep in mind the texts that a NLP system must deal with. Words relate to each other in many different, often odd ways; this information is rarely found in dictionaries, and it is quite hard to deduce a priori. In this paper we present a technique for the acquisition of statistically significant selectional restrictions from corpora and discuss the results of an experimental application with reference to two specific sublanguages (legal and commercial). We show that there are important co-occurrence preferences among words which cannot be established a priori, as they are determined for each choice of sublanguage. The method for detecting co-occurrences is based on the analysis of word associations augmented with syntactic markers and semantic tags. Word pairs are extracted by a morphosyntactic analyzer and clustered according to their semantic tags. A statistical measure is applied to the data to evaluate the significance of any relations detected. Selectional restrictions are acquired by a two-step process. First, statistically prevailing ‘coarse-grained’ conceptual patterns are used by a linguist to identify the relevant selectional restrictions in sublanguages. Second, semi-automatically acquired ‘coarse’ selectional restrictions are used as the ‘semantic bias’ of a system, ARIOSTO_LEX, for the automatic acquisition of a case-based semantic lexicon.
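The abstract does not name the statistical measure applied to the clustered word pairs, so the sketch below uses pointwise mutual information, one commonly used association score for this kind of co-occurrence data. The function name and the example word pairs are invented for illustration only.

```python
import math
from collections import Counter

def pmi_scores(pairs):
    """Pointwise mutual information for co-occurring word pairs.
    pairs: list of (w1, w2) tuples, e.g. as extracted by a
    morphosyntactic analyzer.  Higher scores mean the pair
    co-occurs more often than chance predicts."""
    n = len(pairs)
    pair_counts = Counter(pairs)
    left = Counter(w1 for w1, _ in pairs)
    right = Counter(w2 for _, w2 in pairs)
    scores = {}
    for (w1, w2), c in pair_counts.items():
        p_xy = c / n                 # joint probability of the pair
        p_x = left[w1] / n           # marginal of the first word
        p_y = right[w2] / n          # marginal of the second word
        scores[(w1, w2)] = math.log2(p_xy / (p_x * p_y))
    return scores

# Toy pairs standing in for analyzer output from a legal sublanguage
pairs = [("firma", "contratto"), ("firma", "contratto"),
         ("firma", "documento"), ("legge", "documento")]
scores = pmi_scores(pairs)
```

A threshold on such scores would play the role of the abstract's significance test, retaining only statistically prevailing pairs.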
By
Olsen, Mark
7 Citations
Computer-aided literature studies have failed to have a significant impact on the field as a whole. This failure is traced to a concentration on how a text achieves its literary effect by the examination of subtle semantic or grammatical structures in single texts or the works of individual authors. Computer systems have proven to be very poorly suited to such refined analysis of complex language. Adopting such traditional objects of study has tended to discourage researchers from using the tool to ask questions to which it is better adapted, the examination of large amounts of simple linguistic features. Theoreticians such as Barthes, Foucault and Halliday show the importance of determining the linguistic and semantic characteristics of the language used by the author and her/his audience. Current technology, and databases like the TLG or ARTFL, facilitate such wide-spectrum analyses. Computer-aided methods are thus capable of opening up new areas of study, which can potentially transform the way in which literature is studied.
By
Cattaneo, Gianpiero; Dalla Chiara, Maria L.; Giuntini, Roberto
5 Citations
Fuzzy intuitionistic quantum logics (also called Brouwer-Zadeh logics) represent a non-standard version of quantum logic where the connective “not” is split into two different negations: a fuzzy-like negation that gives rise to a paraconsistent behavior, and an intuitionistic-like negation. A completeness theorem for a particular form of Brouwer-Zadeh logic (BZL^{3}) is proved. A physical interpretation of these logics can be constructed in the framework of the unsharp approach to quantum theory.
By
Olsen, Mark
1 Citation
Conclusion
Donald Bruce and I do speak the same theoretical metalanguage, though I suspect that he is considerably more fluent than I in that tongue. Given the variety of responses, including Bruce's favorable reaction, to my attempts to provoke theoretical debate concerning the nature of electronic text as a new object of research, I am considerably more optimistic than his “knowing smile and tears of rage.” It is my contention that researchers in textual computing have significant advantages in reconceptualizing text precisely because computing technology shatters the evident surface structures of text. If electronic text is a radically different object of research, then theoretical models of the kind discussed in this volume should have a significant impact on disciplines which are currently debating the nature and limits of textuality.
I would like to suggest that we, as specialists in textual computing, should make every effort to combine abstract theoretical considerations with clear efforts towards empirical verification. Maintaining that difficult balance between theory, method, and empirical verification is, in my opinion, one of the central contributions that theory of textual computing can make to critical theory in general. The computing environment provides an ideal testing ground for literary theories by encouraging experimentation and verification using real data, an element that is all too often overlooked by many critical theorists.
By
Makinson, David
37 Citations
We discuss similarities and residual differences, within the general semantic framework of minimality, between defeasible inference, belief revision, counterfactual conditionals, updating — and also conditional obligation in deontic logic. Our purpose is not to establish new results, but to bring together existing material to form a clear overall picture.
By
Goldfield, Joel D.
2 Citations
Lack of a critical mass of scholars involved with the computer-assisted analysis of texts (CAAT), coupled with insufficient communication among various sectors of the literary and linguistic disciplines, has led to a skewed notion of computing humanists' work among their colleagues. This paper highlights the gap through examples of misunderstood humanist needs and achievements drawn from both recent media reports and humanities conferences. It suggests that networking and less modesty in manuscript submission can be at least partial solutions. The author cites some of his own published work and work-in-progress on Stendhal and Gobineau in refuting Mark Olsen's thesis that the dominance of single- or dual-author studies must be the cause of CAAT's “failure” to make significant inroads in mainstream literary journals. The author builds a case for the use of both diachronic and synchronic lexicostatistical data in carrying out such studies successfully. He recommends a new “Synthetic Criticism” where relevant quantitative methods would not be absent.
By
Kolany, Adam
3 Citations
In [4] R. Cowen considers a generalization of the resolution rule for hypergraphs and introduces a notion of satisfiability of families of sets of vertices via 2-colorings piercing elements of such families. He shows, for finite hypergraphs with no one-element edges, that if the empty set is a consequence of A by the resolution rule, then A is not satisfiable. Alas, the converse is true only for a restricted class of hypergraphs, and need not be true in the general case. In this paper we show that by weakening slightly the notion of satisfiability, we get the equivalence of unsatisfiability and the derivability of the empty set for any hypergraph. Moreover, we show the compactness property of hypergraph satisfiability (in the weaker sense) and state its equivalence to BPI, i.e. to the statement that in every Boolean algebra there exists an ultrafilter.
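On one plausible reading of the abstract's "piercing" (a 2-coloring pierces a set when the set contains vertices of both colors), satisfiability can be checked by brute force for small hypergraphs. Cowen's exact definition in [4] may differ, so treat this as an illustrative sketch; note that on this reading a one-element edge can never be pierced, matching the abstract's exclusion of such edges.

```python
from itertools import product

def satisfiable(vertices, family):
    """Brute-force check: is there a 2-coloring of `vertices` under
    which every set in `family` is pierced, i.e. receives both colors?
    (One plausible reading of the abstract's notion.)"""
    vertices = sorted(vertices)
    for colors in product((0, 1), repeat=len(vertices)):
        coloring = dict(zip(vertices, colors))
        # A set is pierced when its vertices carry two distinct colors.
        if all(len({coloring[v] for v in s}) == 2 for s in family):
            return True
    return False
```

For example, the edge set of a triangle is unsatisfiable on this reading: any 2-coloring of three vertices makes some pair monochromatic.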
By
Spolsky, Ellen
Olsen is right to note what can be done with a good theory and the right machine. His particular theory, however, is not transferable to literary studies. If we need a new model, I would suggest that cognitive science can provide a few interesting ones. I have begun to do some work based on David Marr's Vision, in which he hypothesizes two levels of processing within the visual module. My speculation has been on the parallel existence of distinguishable levels of conceptual or language organization which would correspond to the viewer- and object-centered perspectives Marr describes for vision. I propose to explore the possibility that we may find here the model for the existence of stylistic individualism within overarching historical stylistic generalizations, and even more, that this may be what feminists are searching for when they try to resist being co-opted by the masculine language of objectivity.
By
Lessard, Greg; Bénard, Johanne
This article uses recent work on the computer-aided analysis of texts by the French writer Céline as a framework to discuss Olsen's paper on the current state of computer-aided literary analysis. Drawing on analysis of syntactic structures, lexical creativity and use of proper names, it makes two points: (1) given a rich theoretical framework and sufficiently precise models, even simple computer tools such as text editors and concordances can make a valuable contribution to literary scholarship; (2) it is important to view the computer not as a device for finding what we as readers have failed to notice, but rather as a means of focussing more closely on what we have already felt as readers, and of verifying hypotheses we have produced as researchers.
By
Taylor, Dennis
1 Citation
We should follow Mark Olsen's lead and think with maximum ambition of the role of the computer in supporting literary research of the highest order. Thus the computer enables us to answer one of the great questions of literary criticism: how does a given writer contribute to the changing language? We can now chart the influence of given writers by correlating their words and phrasing with computerized dictionaries so as to produce profiles and histories of the way words have entered the language.
By
Freund, Michael
2 Citations
When a proposition α is cumulatively entailed by a finite set A of premisses, there exists, trivially, a finite subset B of A such that B ∪ B′ entails α for all finite subsets B′ that are entailed by A. This property is no longer valid when A is taken to be an arbitrary infinite set, even when the considered inference operation is supposed to be compact. This leads to a refinement of the classical definition of compactness. We call supracompact the inference operations that satisfy the non-finitary analogue of the above property. We show that for any arbitrary cumulative operation C, there exists a supracompact cumulative operation K(C) that is smaller than C and agrees with C on finite sets. Moreover, K(C) inherits most of the properties that C may enjoy, like monotonicity, distributivity or disjunctive rationality. The main part of the paper concerns distributive supracompact operations. These operations satisfy a simple functional equation, and there exists a representation theorem that provides a semantic characterization for this family of operations. We examine finally the case of rational operations and show that they can be represented by a specific kind of model that is particularly easy to handle.
By
Bruce, Donald
Humanities computing (HC) has failed to integrate into its practices many of the key theoretical elements of contemporary text and discourse theory. This has in turn contributed to the marginalization of HC in research and teaching. Outdated theoretical models must be abandoned in order to develop a critical discourse based on the insights of HC. HC projects remain far too attached to microanalyses and have not developed the theoretical and methodological tools necessary to undertake systemic macroanalyses on the level of discourse. Given that texts are a mixture of determinate and dynamic systems, recent developments in chaos theory may be of help in modelling the interrelationship of these elements at discourse level.
By
Restall, Greg
18 Citations
A logic is said to be contraction free if the rule from A → (A → B) to A → B is not truth preserving. It is well known that a logic has to be contraction free for it to support a non-trivial naïve theory of sets or of truth. What is not so well known is that if there is another contracting implication expressible in the language, the logic still cannot support such a naïve theory. A logic is said to be robustly contraction free if there is no such operator expressible in its language. We show that a large class of finitely valued logics are each not robustly contraction free, and demonstrate that some other contraction free logics fail to be robustly contraction free. Finally, the sublogics of Ł_{ω} (with the standard connectives) are shown to be robustly contraction free.
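The failure of contraction can be checked mechanically in a small many-valued logic. The sketch below uses three-valued Łukasiewicz logic (a finitely valued relative of the Ł_{ω} sublogics the abstract mentions) and searches for a valuation where the premise A → (A → B) is designated but the conclusion A → B is not; this illustrates the definition of "contraction free" only, not the paper's robustness results.

```python
def imp(a, b):
    # Łukasiewicz implication on truth values drawn from [0, 1]
    return min(1.0, 1.0 - a + b)

values = (0.0, 0.5, 1.0)   # the three truth values of Ł3; 1 is designated

# Valuations where A -> (A -> B) is designated but A -> B is not:
# each one is a counterexample to the contraction rule.
counterexamples = [(a, b) for a in values for b in values
                   if imp(a, imp(a, b)) == 1.0 and imp(a, b) < 1.0]
```

The search finds exactly one counterexample, A = 1/2 and B = 0: the premise evaluates to 1 while the conclusion evaluates to 1/2.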
By
Henry, Charles
Recent articles have noted that humanities computing techniques and methodologies remain marginal to mainstream literary scholarship. Mark Olsen's paper discusses this phenomenon and argues for large-scale analyses of text databases that would incorporate a shift in theoretical orientation to include greater stress on intertextuality and sign theory. Part of Olsen's argument revolves around the need to move away from the syntactic and overt grammatical elements of textual language to more subtle semantics and meaning systems. While provocative and important, Olsen's stance remains rooted in literary theoretical constructs. Another level of language, the cognitive, offers equally interesting challenges for humanities computing, though the paradigms for this type of computer-based exploration are derived from disciplines traditionally removed from the humanities. The riddle, a nearly universal genre, offers a window onto some of the cognitive processes involved in deep-level language function. By analyzing the riddling process, different methods of computational modelling can be inferred, suggesting new avenues for computing in the humanities.
By
Fortier, Paul A.
1 Citation
Although many scholars in literature currently seem mainly interested in theory, the focus on literary texts is what defines literature studies. Computer technology and the statistical methods it fosters are applicable both to the theoretical and to the interpretative issues which scholars of literature habitually address. Genette's distinction between the homodiegetic and the autodiegetic perspective in first-person narrative can be confirmed statistically. Roquentin's loneliness in La nausée can be shown to be a formal characteristic of the type of novel he narrates, thus validating his commentary on his society. The computer can be used to deal with standard literary questions in a principled fashion, and a new orientation of literature studies on a cultural history model, which Mark Olsen recommends, is not necessary.
By
Schweigert, Dietmar
1 Citation
In the paper we present completeness theorems for hybrid logics, discuss the problem of finite axiomatization and study term rewriting and unification for the variety of distributive lattices and the variety of groups of exponent 2.
By
Matsuba, Stephen Naoyuki
This article describes how linguistic analysis can be used to change the computer from being a generator of data that can be used to develop a critical analysis to being a tool that provides a means to explore the processes and inputs that generate different interpretations of literary works. Using Michael Gregory's Communication Linguistics model to analyze Shakespeare's Sonnet CXXX, we see how the linguistic structure of the poem allows for multiple readings of the text. The article goes on to describe how this kind of analysis can be extended by employing artificial intelligence as a means to explore the interaction between language and meaning.
By
Slaney, John
2 Citations
An Ackermann constant is a formula of sentential logic built up from the sentential constant t by closing under connectives. It is known that there are only finitely many non-equivalent Ackermann constants in the relevant logic R. In this paper it is shown that the most natural systems close to R but weaker than it, in particular the non-distributive system LR and the modalised system NR, allow infinitely many Ackermann constants to be distinguished. The argument in each case proceeds by construction of an algebraic model, infinite in the case of LR and of arbitrary finite size in the case of NR. The search for these models was aided by the computer program MaGIC (Matrix Generator for Implication Connectives) developed by the author at the Australian National University.
By
Greco, Gina L.; Shoemaker, Peter
3 Citations
This paper concurs with Mark Olsen's premise that computeraided literature studies should take a different direction, one that is more suited to the computer's strength in analyzing large corpora of texts. However, the authors take issue with his conclusion that a reorientation of the notions of textual analysis is necessary in order to exploit the computer's capabilities. Contemporary medieval studies already provides us with models of textual analysis which are well suited to computer development. Though they stem from the particularities of medieval textual production, these models can perhaps be useful in the study of modern literatures.
By
Dahl, Veronica; Popowich, Fred; Rochemont, Michael
1 Citation
Parsing according to the principles of modern linguistic theory is only now becoming a computationally interesting task. We contribute to these developments by illustrating how the account of movement introduced by Chomsky in Barriers can be incorporated into a Static Discontinuity Grammar (SDG). We are concerned with A′-movement as reflected in wh-movement of arguments and adjuncts. The resulting SDG can be processed by an SDG parser to recover the thematic information and constituency structure associated with a natural language sentence.
By
Clausing, Stephen
2 Citations
Could a troupe of monkeys really produce Shakespeare if allowed to bang away at the word processor long enough? This age-old question, commonly referred to as the Eddington problem, relates to fundamental issues of probability, and is examined in this article in a new light. Based on earlier research by the physicist William Bennett, Jr., the author describes a data structure which enables the computer to simulate the hypothetical monkeys. Exploiting principles of cryptology, the computer leads the simulated monkeys closer to their goal. Though the intent of the article is to encourage humanistic speculation, the final result proves to be quite practical and may come as a surprise to computer scientists and humanists alike.
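Bennett's simulated monkeys are usually described as striking keys with probabilities learned from a sample text rather than uniformly, which makes the output converge toward English-like character statistics far faster than pure chance. The sketch below implements that first-order version; the sample string, seed, and function names are illustrative, not the article's actual data structure.

```python
import random
from collections import Counter

def train_frequencies(text):
    """Single-character (letter and space) frequencies from a sample text."""
    counts = Counter(text.lower())
    symbols = sorted(counts)
    weights = [counts[s] for s in symbols]
    return symbols, weights

def frequency_monkey(symbols, weights, n, seed=0):
    """A 'first-order' monkey: keys are struck with the trained
    single-character probabilities rather than uniformly at random."""
    rng = random.Random(seed)          # seeded for reproducibility
    return "".join(rng.choices(symbols, weights=weights, k=n))

sample = "to be or not to be that is the question"
symbols, weights = train_frequencies(sample)
typed = frequency_monkey(symbols, weights, 100)
```

Higher-order versions condition each keystroke on the preceding k characters, which is where a dedicated data structure (a table of k-gram continuations) becomes necessary.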
By
Nixon, Paul D.
Involving students in learning a small amount of programming language can enable the teacher to illustrate many of the important concepts of electronic information systems. It introduces them to experiential learning situations involving system design and operation, information handling and the man-machine interface. This paper describes how the authoring language PILOT has been used with arts and humanities undergraduates to increase their understanding of the power and potential of information technology, and to involve them in information problems that relate to their other humanities studies.
By
Henderson-Sellers, Brian; Cooper, David
6 Citations
It has recently been proposed that classical music has a fractal nature. A reanalysis of this proposal reveals some logical flaws in the argument. Chaos, fractals, time series and Schenkerian analysis are contrasted and interrelated. Further consideration of Bach's Invention No. 1 (BWV 772) leads to the conclusion that there is no inherent fractal nature in classical music, although the converse is not true. In other words, it is feasible to use fractal ideas to compose musical pieces — an area of much interest in recent years.
By
Frautschi, Richard; Thoiron, Philippe
2 Citations
Although lexical frequencies are familiar measures of stylistic and thematic analysis, only recently have some stylostatisticians been tempted to investigate the relationship between the frequency and topography of repeated lexical items. In the present paper the authors have turned to the study of the four focal types of discursive narratology, using Marguerite Duras' Moderato Cantabile. Their intent is to uncover aspects of narratological performance which further elucidate the communicative strategies in the story. Part 1 summarizes the problematic between frequency and topography. It describes how a topographical index can be computed for any repeated item and how a Global Topography Index (GTI) can summarize the major topographical characteristics of any text sequence. Part 2 presents a four-cell typology of narrational mode: a segmentation of the verbal chain into narrating and narrated speech acts, with each text sequence tagged according to its discursive function: overt sender intervention for story coherence or comment on the focal level of a narrating present; representation of discrete or unlocalized events on the focal level of a mimeticized past. In Part 3 the focal encodings are displayed in numerical and graphic form, first according to the eight surface chapter divisions and then according to twenty-six subsets of approximately equal length. The fluctuations of the topography indices are reviewed, with particular attention being paid to the manifestation of cluster effects. Although sender interventions predominate, the relativized behavior of each focal type contributes to a climactic unraveling of the intrigue in the final chapters. In conclusion, the authors stress the dichotomy between the calm surface of the chapters and the agitated tensions of the twenty-six subsets.
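The GTI formula itself is not given in the abstract, so as a stand-in the sketch below computes Juilland's D, a standard dispersion coefficient that, like a topography index, distinguishes an evenly spread item (D near 1) from a clustered one (D near 0) by dividing the text into equal segments.

```python
import statistics

def juilland_d(tokens, word, parts=10):
    """Juilland's D dispersion coefficient: 1.0 for a perfectly even
    spread of `word` across `parts` equal segments, near 0.0 for
    heavy clustering.  (An illustrative stand-in for the authors'
    Global Topography Index, whose formula is not given here.)"""
    size = max(1, len(tokens) // parts)
    # Trailing tokens beyond parts * size are ignored in this sketch.
    segments = [tokens[i * size:(i + 1) * size] for i in range(parts)]
    freqs = [seg.count(word) for seg in segments]
    mean = statistics.mean(freqs)
    if mean == 0:
        return 0.0
    v = statistics.pstdev(freqs) / mean       # coefficient of variation
    return 1.0 - v / (parts - 1) ** 0.5
```

Running this over the twenty-six subsets the authors use, instead of ten segments, only requires changing `parts`.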
By
Longeart, Maryvonne; Boss, Gilbert; Skuce, Douglas
This article addresses the methodological problem of the nonlinear representation of philosophical systems in a computerized knowledge base. It is a problem of knowledge representation as defined in the field of artificial intelligence. Instead of a purely theoretical discussion of the issue, we present selected results of a practical experiment which has in itself some theoretical significance. We show how one can represent different philosophies using CODE, a knowledge engineering system developed by artificial intelligence researchers. The hypothesis is that such a computer-based representation of philosophical systems can give insight into their conceptual structure. We argue that computer-aided text analysis can apply knowledge representation tools and techniques developed in artificial intelligence, and we estimate how philosophers as well as knowledge engineers could gain from this cross-fertilization. This paper should be considered a report on an experiment in the use of knowledge representation techniques in computer-aided text analysis. It is part of a much broader project on the representation of conceptual structures in an expert system. However, we have intentionally avoided technical issues related to either computer science or the history of philosophy in order to focus on, on the one hand, the benefit of enhancing traditional humanistic studies with tools and methods developed in AI and, on the other, the need to develop more appropriate tools.
By
Vellino, André
1 Citations
In this paper we describe an improvement of Smullyan's analytic tableau method for the propositional calculus, the Improved Parent Clash Restricted (IPCR) tableau, and show that it is equivalent to SL-resolution in complexity.
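For readers unfamiliar with the baseline being improved, here is a minimal Smullyan-style propositional tableau prover. It implements only the standard method, not the IPCR refinement, and the tuple encoding of formulas is an assumption of this sketch.

```python
# Minimal Smullyan-style analytic tableau for propositional logic.
# Formulas: ("var", name) | ("not", f) | ("and", f, g) | ("or", f, g)

def satisfiable(branch):
    # Expand the first non-literal formula on the branch.
    for i, f in enumerate(branch):
        rest = branch[:i] + branch[i + 1:]
        if f[0] == "and":                       # alpha rule
            return satisfiable(rest + [f[1], f[2]])
        if f[0] == "or":                        # beta rule: split the branch
            return satisfiable(rest + [f[1]]) or satisfiable(rest + [f[2]])
        if f[0] == "not" and f[1][0] == "not":  # double negation
            return satisfiable(rest + [f[1][1]])
        if f[0] == "not" and f[1][0] == "and":  # de Morgan
            return satisfiable(rest + [("or", ("not", f[1][1]), ("not", f[1][2]))])
        if f[0] == "not" and f[1][0] == "or":
            return satisfiable(rest + [("and", ("not", f[1][1]), ("not", f[1][2]))])
    # Only literals remain: the branch is open unless it contains a clash.
    lits = set()
    for f in branch:
        key = (f[1][1], False) if f[0] == "not" else (f[1], True)
        if (key[0], not key[1]) in lits:
            return False  # complementary pair closes the branch
        lits.add(key)
    return True

def valid(f):
    # f is a tautology iff the tableau for its negation closes.
    return not satisfiable([("not", f)])

p = ("var", "p")
print(valid(("or", p, ("not", p))))
```

Refinements such as IPCR restrict which clashes and expansions are attempted; the point of the paper is that this can match SL-resolution in complexity, which the naive version above does not.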
By
Beavers, Gordon
1 Citations
This paper is concerned with decision procedures for the ℵ_{0}-valued Łukasiewicz logics. It is shown how linear algebra can be used to construct an automated theorem checker. Two decision procedures are described which depend on a linear programming package. An algorithm is given for the verification of consequence relations, and a connection is made between theorem checking in two-valued logic and theorem checking in these logics which implies that determining whether a ⊃-free formula takes the value one is an NP-complete problem.
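The truth functions involved are standard. A minimal sketch, assuming the usual definitions of the infinite-valued Łukasiewicz connectives; the paper's linear-programming reduction itself is not reproduced here.

```python
# Standard infinite-valued Łukasiewicz connectives over [0, 1].
# The paper reduces questions about these to linear programming;
# this sketch only shows the truth functions themselves.
from fractions import Fraction

def neg(a):            # v(not A) = 1 - v(A)
    return 1 - a

def implies(a, b):     # v(A ⊃ B) = min(1, 1 - v(A) + v(B))
    return min(1, 1 - a + b)

def disj(a, b):        # v(A or B) = max(a, b)
    return max(a, b)

def conj(a, b):        # v(A and B) = min(a, b)
    return min(a, b)

grid = [Fraction(i, 10) for i in range(11)]
# A ⊃ A takes value 1 under every valuation on the grid.
print(all(implies(a, a) == 1 for a in grid))
# Excluded middle A or not-A is not a theorem: value 1/2 at a = 1/2.
print(min(disj(a, neg(a)) for a in grid))
```

Because the connectives are piecewise linear, whether a formula always takes the value one can be phrased as a system of linear constraints, which is where the linear programming package enters.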
By
Pelletier, Francis J.
THINKER is an automated natural deduction first-order theorem proving program. This paper reports on how it was adapted so as to prove theorems in modal logic. The method employed is an “indirect semantic method”, obtained by considering the semantic conditions involved in being a valid argument in these modal logics. The method is extended from propositional modal logic to predicate modal logic, and issues concerning the domain of quantification and “existence in a world's domain” are discussed. Finally, we look at the very interesting issues involved in adding identity to the theorem prover in the realm of modal predicate logic. Various alternatives are discussed.
By
Gent, Ian P.
In this paper I give conditions under which a matrix characterisation of validity is correct for first-order logics where quantifications are restricted by statements from a theory. Unfortunately the usual definition of path closure in a matrix is unsuitable, and a less pleasant definition must be used. I derive the matrix theorem from syntactic analysis of a suitable tableau system, but by choosing a tableau system for restricted quantification I generalise Wallen's earlier work on modal logics. The tableau system is only correct if a new condition I call “alphabetical monotonicity” holds. I sketch how the result can be applied to a wide range of logics, such as first-order variants of many standard modal logics, including non-serial modal logics.
By
Buchsbaum, Arthur; Pequeno, Tarcisio
5 Citations
A proof method for automation of reasoning in a paraconsistent logic, the calculus C1* of da Costa, is presented. The method is analytical, using a specially designed tableau system. In fact, two tableau systems were created. The first, with a small number of rules in order to be mathematically convenient, is used to prove the soundness and the completeness of the method. The other, which is equivalent to the former, is a system of derived rules designed to enhance computational efficiency. A prototype based on this second system was effectively implemented.
By
Jiang, Yue J.
1 Citations
One of the fundamental properties in classical equational reasoning is Leibniz's principle of substitution. Unfortunately, this property does not hold in standard epistemic logic. Furthermore, Herbrand's lifting theorem, which is essential to the completeness of resolution and paramodulation in classical first-order logic (FOL), turns out to be invalid in standard epistemic logic. In particular, unlike classical logic, there is no skolemization normal form for standard epistemic logic. To solve these problems, we introduce an intensional epistemic logic, based on a variation of Kripke's possible-worlds semantics that need not have a constant domain. We show how a weaker notion of substitution through indexed terms can retain the Herbrand theorem. We prove that the logic can yield a satisfiability-preserving skolemization form. In particular, we present an intensional principle for unifying indexed terms. Finally, we describe a sound and complete inference system for a Horn subset of the logic with equality, based on epistemic SLD-resolution.
By
Caferra, Ricardo; Demri, Stéphane; Herment, Michel
1 Citations
There exist valuable methods for theorem proving in non-classical logics based on translation from these logics into first-order classical logic (abbreviated henceforth FOL). The key notion in these approaches is translation from a Source Logic (henceforth abbreviated SL) to a Target Logic (henceforth abbreviated TL). These methods are concerned with the problem of finding a proof in TL by translating a formula in SL, but they do not address the very important problem of presenting proofs in SL via a backward translation. We propose a framework for presenting proofs in SL based on a partial backward translation of proofs obtained in a familiar TL: Order-Sorted Predicate Logic. The proposed backward translation transfers some formulas F_{TL} belonging to the proof in TL into formulas F_{SL}, such that the formulas F_{SL} either (a) belong to a corresponding deduction in SL (in the best case) or (b) are semantically related, in some precise way, to formulas in the corresponding deduction in SL (in the worst case). The formulas F_{TL} and F_{SL} can obviously be considered as lemmas of their respective proofs. Therefore the transfer of lemmas of TL gives at least a skeleton of the corresponding proof in SL. Since the formulas of a proof “keep trace” of the strategy used to obtain the proof, the framework can clearly also help in solving another fundamental and difficult problem: the transfer of strategies from classical to non-classical logics. We show how to apply the proposed framework, at least to S5, S4(p), K, T and K4. Two conjectures are stated, and we propose sufficient (and in general satisfactory) conditions for obtaining formulas in the proof in SL. Two particular cases of the conjectures are proved to be theorems. Three examples are treated in full detail. The main lines of future research are given.
By
McKinnon, Alastair
3 Citations
This paper describes the use of correspondence analysis to create the “space” of a book, constructs that of Kierkegaard's Fear and Trembling as an illustration, and distinguishes three separate contexts of some of its most important words: the spatial context (where the search word lies in that named and ordered space); the overall context (the x words closest to the search word in multidimensional space); and the “role/sense” context (the words associated with the search word in each of its most important roles, some of which may represent new senses). It describes the identification of these contexts, discusses their importance and concludes by noting certain respects in which the procedure might perhaps be improved.
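The “overall context”, the x words closest to the search word in the space, amounts to a nearest-neighbour query. The sketch below assumes invented two-dimensional coordinates; in the paper they would come from correspondence analysis of the book itself.

```python
# Sketch of the "overall context": the x words closest to a search
# word in a low-dimensional space. Coordinates here are hypothetical
# placeholders, not output of correspondence analysis.
import math

space = {  # word -> (dim1, dim2)
    "faith":   (0.9, 0.1),
    "dread":   (0.8, 0.2),
    "absurd":  (0.7, 0.3),
    "ethical": (-0.5, 0.6),
    "knight":  (0.85, 0.15),
}

def overall_context(word, x):
    # Rank the other words by Euclidean distance to `word`.
    dist = lambda a, b: math.dist(space[a], space[b])
    others = sorted((w for w in space if w != word),
                    key=lambda w: dist(w, word))
    return others[:x]

print(overall_context("faith", 2))
```

The spatial and role/sense contexts would need, respectively, the named axes of the space and per-role sub-spaces, which this sketch does not model.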
By
Nell, Sharon Diane
Benoît de Cornulier's writings on French poetry concentrate on metrical boundaries, or caesura; however, the criteria upon which he bases his analyses are useful in studying rhythm, or the relationship between syllables within the alexandrine's two hémistiches. This study focuses on three aspects of rhythm in French poetry: the definition of rhythm following Cornulier; the development of a method using the computer to detect rhythmic patterns in traditional isometrical alexandrines; and the results of such a study when applied to three classical seventeenth-century plays composed of isometrical alexandrines (Corneille's Polyeucte, Racine's Phèdre, and Molière's Le Tartuffe).
By
Pennington, Martha C.
6 Citations
The potential of word processing for non-native student writers is explored through an examination of effects in the categories of writing process, quantity and quality of writing, planning and prewriting, revising, conception of writing, cognitive processing, setting effects, and attitudes. It is maintained that non-native writers may benefit in significant ways from the attributes of word processing and the conditions surrounding its use in composition instruction.
By
Na, Younghee; Huck, G. J.
5 Citations
Summary and Conclusion
We have demonstrated in this study that the island phenomena exhibited in Korean complex constructions, such as they are, follow from the strict application of the Argument Condition to the semantic interpretations of those constructions, and not from formal restrictions on the location of the antecedents of gaps. The AC was shown to entail a kind of subjacency restriction, although it is immaterial to the AC whether a particular gap is locally bound in a clause as long as the head or topic of the clause can find another element of the appropriate type in the proper position in that clause. Long-distance dependencies may then be sanctioned simply by default.
An important assumption of this study is that the AC is a language-specific condition that characterizes the way semantic rules apply to the particular structures produced by the syntactic rules of the Korean grammar; hence, we would not necessarily expect to find an identical condition in languages with markedly different syntaxes. For example, English does not admit Multiple Subject Constructions, and thus, whatever restrictions it places on the distribution of gaps, there can be no English equivalent of the B-clause of the AC. But, as we've seen, that clause is crucial in licensing long-distance dependencies in relatives and topic complements in Korean. If this is correct, and the evidence appears quite persuasive that it is, then the chief difference between Korean and English with respect to whether CNPC violations are tolerated consequently resides not in the typology of gaps in the syntactic structures produced by the two grammars, but rather in the possibility of forming such structures without gaps.
By
Arnold, Doug; Sadler, Louisa; Humprheys, R. Lee
3 Citations
The primary aim of this contribution is to provide an editorial introduction to this Special Issue of Machine Translation dedicated to Evaluation. The intention is to describe the rationale for the Issue, outline the various contributions of the papers in this issue, and try to situate them in a wider context. As part of providing this wider context, we give an overview and assessment of the main current approaches to the evaluation of Natural Language Processing, and especially Machine Translation, systems.
By
Shiwen, Yu
5 Citations
Automatic evaluation of output quality for Machine Translation (MT) systems is a difficult task. The Institute of Computational Linguistics of Peking University has developed an automatic evaluation system. This paper introduces the basic principles of the Machine Translation Evaluation (MTE) system and its implementation techniques, and describes the results achieved.
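As a generic illustration of automatic output scoring (not the MTE system's actual principles, which the paper itself describes), one can compare system output to a reference translation by sequence similarity over words:

```python
# Minimal automatic MT-quality proxy: similarity of system output to
# a reference translation. A generic illustration only; the MTE
# system's real scoring method is not reproduced here.
import difflib

def similarity_score(candidate, reference):
    # Ratio in [0, 1] over word sequences (1.0 = identical).
    sm = difflib.SequenceMatcher(None, candidate.split(), reference.split())
    return sm.ratio()

ref = "the cat sat on the mat"
print(similarity_score("the cat sat on a mat", ref))
```

Real MT evaluation has to cope with legitimate paraphrase, word-order freedom and multiple valid references, which is exactly what makes the task described in the abstract difficult.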
By
Jordan, Pamela W.; Dorr, Bonnie J.; Benoit, John W.
1 Citations
This paper describes a short-term survey and evaluation project that covered a large number of machine translation products and research. We discuss our evaluation approach and address certain issues and implications relevant to our findings. We represented a variety of potential users of MT systems and were faced with the task of identifying which systems would best help them solve their translation problems.
By
Nerbonne, John; Netter, Klaus; Diagne, Abdel Kader; Klein, Judith; Dickmann, Ludwig
1 Citations
In this paper we describe an ongoing effort to construct a catalogue of syntactic data exemplifying the major syntactic patterns of German. The purpose of the corpus is to support the diagnosis of errors in the syntactic components of natural language processing (NLP) systems. Secondary aims are the evaluation of NLP syntax components and support of theoretical and empirical work on German syntax.
The data consist of artificially and systematically constructed expressions, including also negative (ungrammatical) examples. The data are organized into a relational database and annotated with some basic information about the phenomena illustrated and the internal structure of the sample sentences. The organization of the data supports selected systematic testing of specific areas of syntax, but also serves the purpose of a linguistic database.
The paper first gives some general motivation for the necessity of syntactic precision in some areas of NLP and discusses the potential contribution of a syntactic database to the field of component evaluation. The second part of the paper describes the set-up and control methods applied in the construction of the sentence suite and the annotations to the examples. We illustrate the approach with examples from verbal government and sentential coordination. This section also contains a description of the abstract data model, the design of the database and the query language used to access the data. The final sections compare our work to existing approaches and sketch some future extensions.
We invite other research groups to participate in our effort, so that the diagnostics tool can eventually become public domain. Several groups have already accepted this invitation, and progress is being made.
By
Arnold, Doug; Moffat, Dave; Sadler, Louisa; Way, Andrew
2 Citations
A Test Suite (TS) is typically a collection of Natural Language sentences against which the coverage of a Natural Language Processing system can be evaluated. We describe a method by which such suites can be produced automatically, involving a modification and extension of the Definite Clause Grammar formalism, and describe some of the advantages of the method over the traditional method of manual construction.
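The idea of generating a test suite from a grammar can be sketched with a plain context-free grammar in place of the Definite Clause Grammar formalism the paper extends. The toy grammar below is an assumption made for illustration; a DCG would additionally thread features (agreement, subcategorization) through the rules.

```python
# Sketch of automatic test-suite generation from a grammar, in the
# spirit of (but much simpler than) the paper's DCG-based method.
import itertools

GRAMMAR = {  # hypothetical toy grammar, non-recursive so it is finite
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"]],
    "VP": [["V"], ["V", "NP"]],
    "N":  [["dog"], ["cat"]],
    "V":  [["sleeps"], ["sees"]],
}

def expand(symbol):
    # Yield every terminal string derivable from `symbol`.
    if symbol not in GRAMMAR:          # terminal symbol
        yield [symbol]
        return
    for production in GRAMMAR[symbol]:
        for parts in itertools.product(*(list(expand(s)) for s in production)):
            yield [tok for part in parts for tok in part]

suite = [" ".join(toks) for toks in expand("S")]
print(len(suite))
```

With a recursive grammar one would bound the derivation depth; the exhaustive enumeration is what replaces manual construction of the suite.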
By
Sasaki, Katsumi
2 Citations
The simple substitution property provides a systematic and easy method for proving a theorem in an axiomatic way. The notion of the property was introduced in Hosoi [4], but without a definite name, and he showed three examples of axioms with the property. Later, the property was given its name in Sasaki [7].
Our main result here is that the necessary and sufficient condition for a logic L on a finite slice to have the simple substitution property is that L is finite. The necessity part is essentially new, for the sufficiency part has been proved in Hosoi and Sasaki [5]; the proof of the sufficiency part is also improved here.
For logics on the ωth slice, the condition for them to have the simple substitution property is not yet known.
We abbreviate the simple substitution property as SSP.
By
Nettheim, Nigel
1 Citations
Attention is drawn to the need for controlling (during encoding) and checking (after encoding) the quality or accuracy of musical data. Some large databases of melodies are now becoming available, and methods of control and checking are presented which are specially suited to these. Two applications are discussed in detail: to Gregorian Chant and to German folksong. An effective method in tonal and modal music is found to be the investigation of melodic progressions which remain unusual even after amalgamation by transposition to a central register.
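The amalgamation of melodic progressions by transposition can be sketched as follows. The pitch encoding (MIDI numbers) and the rarity criterion are assumptions of this sketch, not details from the paper; the idea is only that a progression flagged as rare after register differences are collapsed is a candidate encoding error.

```python
# Sketch: amalgamate melodic progressions by transposition, then
# flag progressions that remain unusual. Pitches are MIDI numbers;
# encoding and thresholds are hypothetical.
from collections import Counter

def progressions(melody):
    # Successive pitch pairs, each transposed to start on 0, so that
    # the same interval in any register is counted together.
    return [(0, b - a) for a, b in zip(melody, melody[1:])]

melodies = [
    [60, 62, 64, 62, 60],        # C D E D C
    [67, 69, 71, 69, 67],        # same contour a fifth higher
    [60, 72, 60],                # octave leaps (rarer)
]
counts = Counter(p for m in melodies for p in progressions(m))
rare = [p for p, c in counts.items() if c == 1]
print(counts[(0, 2)], rare)
```

In a real database the rare list would be reviewed by hand: some entries are genuine melodic surprises, others are typing or scanning errors, which is the quality-control use the paper describes.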
By
White, Richard B.
6 Citations
This essay demonstrates proof-theoretically the consistency of a type-free theory C with an unrestricted principle of comprehension, based on a predicate logic in which contraction, (A → (A → B)) → (A → B), although it does not hold in general, is provable for a wide range of A's. C is presented as an axiomatic theory CH (with a natural-deduction equivalent CS) as a finitary system, without formulas of infinite length. Then CH is proved simply consistent by passing to a Gentzen-style natural-deduction system CG that allows countably infinite conjunctions and in which all theorems of CH are provable. CG is seen to be consistent by a normalization argument. It is also shown that in a sense C is highly non-extensional.
By
Rautenberg, Wolfgang
3 Citations
It is shown that the class of reduced matrices of a logic ⊢ is a first-order ∀∃-class provided the variety associated with ⊢ has the finite replacement property in the sense of [7]. This applies in particular to all two-valued logics. For three-valued logics the class of reduced matrices need not be first-order.
By
Roudaud, Brigitte; Puerta, Maria Claudia; Gamrat, Otemea
Evaluation of MT systems can be performed by developers or by end-users. In the following paper, we present an evaluation procedure (β-test) which involves both. It will be used to evaluate the ARIANE French-to-English system.
The quality of the MT system will be evaluated in terms of efficiency and upgrading capacity. Several thousand pages will be translated by the system and revised by professional translators. This will give us the figures needed to measure efficiency through cost saving. Problem sheets, filled in by the revisers, will be sent to the developers, who will use them to improve the MT system. Upgrading capacity will be evaluated with the help of these problem sheets and also by retranslating some parts of the texts at the end of the test and comparing both versions.
By
Olsen, Mark; McLean, Alice Music
1 Citations
Optical Character Recognition is shown to be significantly more expensive than keyboarding, using offshore contractors, for entry of large amounts of text where high accuracy is required. Using large test samples in French and English, the paper indicates that OCR applications which require significant post-scan editing are labor-intensive projects that can be accomplished more efficiently by keyboarding. Most OCR systems are still not capable of entering large amounts of text accurately enough to avoid an expensive editing step.
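The economic argument can be made concrete with a back-of-the-envelope cost model. All rates below are hypothetical placeholders, not figures from the study; the point is only that per-error correction cost, not scanning cost, dominates when accuracy requirements are high.

```python
# Back-of-the-envelope OCR vs. keyboarding cost model.
# Every rate here is a hypothetical placeholder, not a figure from
# the paper's French/English test samples.

def keyboard_cost(pages, rate_per_page):
    return pages * rate_per_page

def ocr_cost(pages, scan_rate, error_rate, chars_per_page, fix_cost):
    # Scanning is cheap; correcting residual errors is the labor sink.
    errors = pages * chars_per_page * error_rate
    return pages * scan_rate + errors * fix_cost

pages = 1000
kb = keyboard_cost(pages, rate_per_page=1.50)
oc = ocr_cost(pages, scan_rate=0.10, error_rate=0.005,
              chars_per_page=2000, fix_cost=0.25)
print(kb, oc, oc > kb)
```

Under these placeholder rates a 0.5% character error rate already makes OCR the dearer route, which mirrors the paper's finding for high-accuracy text entry.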
By
Hosoi, Tsutomu; Masuda, Isao
The intermediate logics have been classified into slices (cf. Hosoi [1]), but the detailed structure of slices has been studied only for the first two (cf. Hosoi and Ono [2]). In order to study the structure of slices, we give a method for a finer classification of the slices S_{n} (n ≥ 3). Here we treat only the third slice as an example, but the method can be extended to other slices in an obvious way. It is proved that each subslice contains a continuum of logics. A characterization of the logics in each subslice is given in terms of the form of their models.
By
Minnis, Stephen
When surveying the many methods currently employed in MT evaluation,^{1} it is not immediately obvious that the methods used serve to increase our knowledge of the properties being measured. This report describes a constructive machine translation evaluation method aimed at addressing this issue.^{2}
By
Thomas, Jean-Jacques
The study of signs is divided between those scholars who use the Saussurian binary sign (semiology) and those who prefer Charles Peirce's tripartite sign (semiotics). The common view of the opposition between the two types of signs does not take into consideration the methodological conditions of applicability of these two types of signs. This is particularly important in the field of literary studies and hence for the preparation of electronic programs for text analysis. The Peircian sign explicitly entails the discovery of a truth of meaning that claims to be universal and not reducible to a collection of opinions based on fragmented information; it also imposes the task of elucidating a transhistorical and universal signification encoded in a text. Contrary to Peirce's view of the sign, however, our use of computer programs for text analysis demonstrates that we implicitly treat every literary text as a set of linguistic data (letters, phonemes, syntagmatic segments, etc.) which are reducible to units that can be treated separately. A brief comparison of the results obtained from computer analyses of the French poet Stéphane Mallarmé's text, “Le Cygne,” with those obtained from two Peircian analyses (by Riffaterre and Champigny) of the same text demonstrates that our current methods of computer textual analysis are based on a Saussurian semiology, which is unidimensional and limited, and that these methods are still quite unable to produce a semiotic interpretation based on a totalizing hierarchy of the text's various discursive components.
By
Suzuki, Nobu-Yuki
4 Citations
Some properties of Kripke-sheaf semantics for superintuitionistic predicate logics are shown. The concept of p-morphisms between Kripke sheaves is introduced. It is shown that if there exists a p-morphism from a Kripke sheaf κ_{1} into κ_{2}, then the logic characterized by κ_{1} is contained in the logic characterized by κ_{2}. Examples of Kripke-sheaf-complete and finitely axiomatizable superintuitionistic (and intermediate) predicate logics, each of which is Kripke-frame incomplete, are given. A correction to the author's previous paper “Kripke bundles for intermediate predicate logics and Kripke frames for intuitionistic modal logics” (Studia Logica, 49 (1990), pp. 289–306) is stated.
