By
Arnold, Douglas
2 Citations
Conclusion
This is an interesting and informative book with much to recommend it. It covers a great deal of ground in its discussion of ideas and presentation of an actual implementation, but I believe the major contribution to be in four areas:
- In presenting a system whose syntax is based on “principles and parameters”, Dorr provides an interesting challenge to the standard rule-based approaches, which are broadly unification-based.
- Dorr presents an interlingua which appears to have relatively solid linguistic motivation, and for which there is a very systematic mapping to and from text. This directly addresses two of the standard objections to interlingual approaches: arbitrariness and lack of systematicity. Unfortunately, the range of phenomena she considers is too limited to address the other major objection normally raised against interlingual approaches: lack of coverage.
- Dorr presents a classification of translation divergences. I believe such a classification to be worthwhile, and I take this as a useful beginning. However, I find the actual classification proposed too broad and too theory-dependent. Moreover, Dorr's claims about the completeness of the classification are not convincing.
- Dorr presents a solution to various translation divergences via parameterization of the interlingual representation. Here I believe reservations about the conceptual coherence of the representation and the generality of the approach are appropriate.
By
Say, B.; Akman, V.
4 Citations
Some recent studies in computational linguistics have aimed to take advantage of various cues presented by punctuation marks. This short survey is intended to summarise these research efforts and, additionally, to outline a current perspective on the usage and functions of punctuation marks. We conclude by presenting an information-based framework for punctuation, influenced by treatments of several related phenomena in computational linguistics.
Varol Akman is a professor of computer engineering at Bilkent University, Ankara, Turkey. From 1980 to 1985, he was a Fulbright scholar at Rensselaer Polytechnic Institute, Troy, New York, where he received a PhD degree in computer engineering. Prior to joining Bilkent in 1988, he held a senior researcher position with the Centrum voor Wiskunde en Informatica, Amsterdam, the Netherlands. His current research areas include artificial intelligence models of context, computational aspects of situation theory, and in general, language and philosophy.
By
Hogenraad, Robert; McKenzie, Dean P.; Martindale, Colin
5 Citations
Many content analysis studies involving temporal data are biased by some unknown dose of autocorrelation. The effect of autocorrelation is to inflate or deflate the significant differences that may exist among the different parts of texts being compared. The solution consists in removing effects due to autocorrelation, even if the latter is not statistically significant. Procedures such as Crosbie's (1993) ITSACORR remove the effect of at least first-order autocorrelations and can be used with small samples. The AREG procedure of SPSS (1994) and the AUTOREG procedure of SAS (1993) can be employed to detect and remove first-order autocorrelations, and higher-order ones too in the case of AUTOREG, while several methods specifically intended for small samples (Huitema and McKean, 1991, 1994) have been developed. Four examples of content analysis studies with and without autocorrelation are discussed.
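The quasi-differencing idea behind these autocorrelation-removal procedures can be sketched in a few lines of Python. This is a minimal illustration of the principle, not the ITSACORR, AREG, or AUTOREG algorithms themselves; the AR(1) series and its coefficient 0.8 are invented for the demonstration.

```python
import numpy as np

def lag1_autocorr(x):
    """Sample lag-1 autocorrelation of a series."""
    x = np.asarray(x, dtype=float)
    xc = x - x.mean()
    return float(np.sum(xc[1:] * xc[:-1]) / np.sum(xc * xc))

def prewhiten(x):
    """Quasi-difference the series: x*_t = x_t - r * x_(t-1),
    where r is the estimated lag-1 autocorrelation."""
    x = np.asarray(x, dtype=float)
    r = lag1_autocorr(x)
    return x[1:] - r * x[:-1], r

# A strongly autocorrelated AR(1) series standing in for a sequence
# of per-segment content scores (the coefficient 0.8 is invented).
rng = np.random.default_rng(0)
x = np.zeros(200)
for t in range(1, 200):
    x[t] = 0.8 * x[t - 1] + rng.normal()

cleaned, r = prewhiten(x)
print(r)                       # close to 0.8
print(lag1_autocorr(cleaned))  # close to 0
```

After the transform, comparisons among segments of the cleaned series are no longer inflated by the serial dependence.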
By
Cunningham, Sally Jo
2 Citations
A common problem in anthropological field work is generalizing rules governing social interactions and relations (particularly kinship) from a series of examples. One class of machine learning algorithms is particularly well-suited to this task: inductive logic programming systems, as exemplified by FOIL. A knowledge base of relationships among individuals is established, in the form of a series of single-predicate facts. Given a set of positive and negative examples of a new relationship, the machine learning programs build a Horn clause description of the target relationship. The power of these algorithms to derive complex hypotheses is demonstrated for a set of kinship relationships drawn from the anthropological literature. FOIL extends the capabilities of earlier anthropology-specific learning programs by providing a more powerful representation for induced relationships, and is better able to learn in the face of noisy or incomplete data.
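The kind of fact base and induced Horn clause described here can be illustrated with a toy example. This is a sketch of what a FOIL-style learner produces, not FOIL itself; the names, facts, and the grandparent relation are invented for illustration.

```python
# Hypothetical knowledge base of single-predicate facts:
# parent(X, Y) means X is a parent of Y (all names invented).
parent = {("tom", "bob"), ("tom", "liz"), ("bob", "ann"), ("bob", "pat")}

def grandparent(x, y):
    """A Horn clause of the kind FOIL induces from examples:
    grandparent(X, Y) :- parent(X, Z), parent(Z, Y)."""
    return any((x, z) in parent and (z, y) in parent
               for z in {b for (_, b) in parent})

# The learned clause covers the positive examples and rejects
# the negative ones.
positives = [("tom", "ann"), ("tom", "pat")]
negatives = [("bob", "liz"), ("ann", "tom")]
print(all(grandparent(*p) for p in positives))      # True
print(not any(grandparent(*n) for n in negatives))  # True
```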
By
Allen, John Robin
The article briefly reviews the history of the ELSE program, one of the first Computer-Assisted Language Learning (CALL) programs to use sophisticated error analysis. Its development evokes ten desiderata for such programs (not all of which are found in ELSE). The article, addressed to language teachers and to programmers working with CALL, describes those desiderata and the problems that ensue as one tries to implement them in a specific piece of software.
Allen is the Canadian representative to the Association for Literary and Linguistic Computing, wrote the constitution of the Association for Computers and the Humanities, was the Associate Editor of Computers and the Humanities, was the co-founder and co-editor of System (published by Pergamon Press and devoted to educational technology and applied linguistics), and is the founding editor of Olifant (published by the Société Rencesvals and devoted to medieval epic literature in the Romance languages).
By
Merriam, Thomas
M.W.A. Smith's previous study (1985) of A. Q. Morton's stylometric use of word habits (collocations, positional words and word pairs) as variables for distinguishing authors concluded that the method was unreliable and that the conclusions reached using it could not therefore be sustained. The conclusions referred to were the following: that the anonymous play Sir Thomas More, conventionally ascribed to Anthony Munday and others, was mostly written by Shakespeare; that the first two acts of Pericles, like the final three, were written by Shakespeare; and that Henry VIII was confirmed as the work of at least two playwrights, William Shakespeare and John Fletcher. Multivariate analysis can now be applied to the disputed stylometric data to give a simpler and more definitive resolution. The result of principal component analysis indicates that the word habits in question reliably differentiate 36 entire Shakespeare plays from the non-Shakespearean plays that were claimed to be indistinguishable from them.
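The use of principal component analysis to separate authors by word-habit rates can be sketched as follows. This is a toy illustration with invented frequency values, not Merriam's actual data or variables.

```python
import numpy as np

# Toy frequency matrix: rows are texts, columns are rates of selected
# word habits. All numbers are invented for the illustration.
X = np.array([
    [0.90, 0.10, 0.80, 0.20],   # texts by "author A"
    [0.80, 0.20, 0.90, 0.10],
    [0.85, 0.15, 0.85, 0.15],
    [0.10, 0.90, 0.20, 0.80],   # texts by "author B"
    [0.20, 0.80, 0.10, 0.90],
    [0.15, 0.85, 0.15, 0.85],
])

# Principal component analysis via SVD of the centred matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
pc1 = (Xc @ Vt.T)[:, 0]   # scores on the first principal component

# The two "authors" fall on opposite sides of zero on PC1.
print(bool((pc1[:3] * pc1[3:] < 0).all()))  # True
```

When the habit rates differ systematically between authors, the first component captures that contrast and the texts cluster by author along it.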
By
Greco, Gina L.; Paff, Toby; Shoemaker, Peter W.
This paper concerns the Charrette Project, a multimedia electronic archive of a medieval manuscript tradition. In this paper, we argue that the computer's strengths in manipulating complex and varied resources should be an important organizing principle in the conception and construction of electronic text projects. Specifically, we describe the elements of the Charrette archive, its architecture, and its potential for scholarly research and pedagogical applications.
Toby Paff is UNIX Systems Programmer at the Computing and Information Technology Center (CIT) at Princeton University. He has worked as Humanities Specialist and Manager of Information Access at CIT.
Peter W. Shoemaker is an advanced doctoral candidate in the Department of RLL at Princeton University. He is currently completing his dissertation, The Rhetoric of Patronage in Seventeenth-Century France.
By
Verkuyl, H. J.; Vermeulen, C. F. M.
The topic of this paper is the way in which the structure of events features in discourse. We focus on the structure as introduced by verbs that express some sense of progress. First it is shown by means of examples that this structure is anaphorically available in discourse. Then we go on to discuss the different ways in which the same event may be structured within one discourse situation. We give formal representations of the crucial examples in many-sorted dynamic logic.
By
Shieber, Stuart M.; Pereira, Fernando C. N.; Dalrymple, Mary
12 Citations
Systematic semantic ambiguities result from the interaction of the two operations that are involved in resolving ellipsis in the presence of scoping elements such as quantifiers and intensional operators: scope determination for the scoping elements and resolution of the elided relation. A variety of problematic examples previously noted (by Sag, Hirschbühler, Gawron and Peters, Harper, and others) all have to do with such interactions. In previous work, we showed how ellipsis resolution can be stated and solved in equational terms. Furthermore, this equational analysis of ellipsis provides a uniform framework in which interactions between ellipsis resolution and scope determination can be captured. As a consequence, an account of the problematic examples follows directly from the equational method. The goal of this paper is merely to point out this pleasant aspect of the equational analysis, through its application to these cases. No new analytical methods or associated formalism are presented, with the exception of a straightforward extension of the equational method to intensional logic.
By
Cowen, Robert; Emerson, William
1 Citation
It is proved that a system of linear equations over an arbitrary field has a solution if every finite subsystem has a solution provided that the set of variables can be well ordered.
By
Vaggione, Diego
1 Citation
It is proved that the directly indecomposable algebras in a congruence modular equational class ν form a ∀∃∀ first-order class provided that ν fulfils two natural assumptions.
By
Amerbauer, Martin
7 Citations
We give sound and complete tableau and sequent calculi for the propositional normal modal logics S4.04, K4B and G^{0} (these logics are the smallest normal modal logics containing K and the schemata □A → □□A, □A → A and □⋄□A → (A → □A); □A → □□A and A → □⋄A; □A → □□A and □(□(A → □A) → A) → □A, respectively) with the following properties: the calculi for S4.04 and G^{0} are cut-free and have the interpolation property, while the calculus for K4B contains a restricted version of the cut-rule, the so-called analytical cut-rule.
In addition we show that G^{0} is not compact (and therefore not canonical), and we prove with the tableau method that G^{0} is characterised by the class of all finite, (transitive) trees of degenerate or simple clusters of worlds; therefore G^{0} is decidable and also characterised by the class of all frames for G^{0}.
By
Avellone, Alessandro; Fiorentini, Camillo; Mantovani, Paolo; Miglioli, Pierangelo
1 Citation
We extend to the predicate frame a previous characterization of the maximal intermediate propositional constructive logics. This provides a technique for obtaining maximal intermediate predicate constructive logics starting from suitable sets of classically valid predicate formulae, which we call maximal nonstandard predicate constructive logics. As an example of this technique, we exhibit two maximal intermediate predicate constructive logics, while leaving open the problem of whether the two logics are distinct. Further properties of these logics will also be investigated.
By
Herrmann, Burghard
24 Citations
The notion of an algebraizable logic in the sense of Blok and Pigozzi [3] is generalized to that of a possibly infinitely algebraizable (for short, p.i.-algebraizable) logic by admitting infinite sets of equivalence formulas and defining equations. An example of the new class is given. Many ideas of this paper have been present in [3] and [4]. By a consequent matrix semantics approach, the theory of algebraizable and p.i.-algebraizable logics is developed in a different way. It is related to the theory of equivalential logics in the sense of Prucnal and Wroński [18], and it is extended to non-finitary logics. The main result states that a logic is algebraizable (p.i.-algebraizable) iff it is finitely equivalential (equivalential) and the truth predicate in the reduced matrix models is equationally definable.
By
Reynolds, Mark
10 Citations
We present an axiomatisation for the first-order temporal logic with connectives Until and Since over the class of all linear flows of time. Completeness of the axiom system is proved.
We also add a few axioms to obtain a sound and complete axiomatisation for the first-order temporal logic of Until and Since over rational-number time.
By
Kaneko, Mamoru; Nagashima, Takashi
17 Citations
This paper provides a logic framework for investigations of game theoretical problems. We adopt an infinitary extension of classical predicate logic as the base logic of the framework. The reason for an infinitary extension is to express the common knowledge concept explicitly. Depending upon the choice of axioms on the knowledge operators, there is a hierarchy of logics. The limit case is an infinitary predicate extension of modal propositional logic KD4, and is of special interest in applications. In Part I, we develop the basic framework, and show some applications: an epistemic axiomatization of Nash equilibrium and formal undecidability on the playability of a game. To show the formal undecidability, we use a term existence theorem, which will be proved in Part II.
By
Insall, Matt
Using nonstandard methods, we generalize the notion of an algebraic primitive element to that of a hyperalgebraic primitive element, and show that under mild restrictions such elements can be found infinitesimally close to any given element of a topological field.
By
Laan, Twan; Nederpelt, Rob
5 Citations
The paper first formalizes the ramified type theory as (informally) described in the Principia Mathematica [32]. This formalization is close to the ideas of the Principia, but also meets contemporary requirements of formality and accuracy, and is therefore a new addition to the literature on the Principia (such as [25], [19], [6] and [7]).
As an alternative, notions from the ramified type theory are expressed in a lambda calculus style. This situates the type system of Russell and Whitehead in a modern setting. Both formalizations are inspired by current developments in research on type theory and typed lambda calculus; see [3].
By
Prijatelj, Andreja
19 Citations
In this paper, we consider multiplicative-additive fragments of affine propositional classical linear logic extended with n-contraction. To be specific, n-contraction (n ⩾ 2) is a version of the contraction rule where (n + 1) occurrences of a formula may be contracted to n occurrences. We show that expansions of the linear models for (n + 1)-valued Łukasiewicz logic are models for the multiplicative-additive classical linear logic, its affine version and their extensions with n-contraction. We prove finite axiomatizability for the classes of finite models, as well as for the class of infinite linear models based on the set of rational numbers in the interval [0, 1]. The axiomatizations obtained in a Gentzen-style formulation are equivalent to finite- and infinite-valued Łukasiewicz logics.
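The n-contraction rule described here can be written out in Gentzen-style sequent notation as a sketch (the braces abbreviate repeated occurrences of the formula A in the antecedent):

```latex
% n-contraction (n >= 2): n+1 occurrences of A contract to n.
\[
\frac{\Gamma,\ \overbrace{A,\dots,A}^{\,n+1}\ \vdash\ \Delta}
     {\Gamma,\ \underbrace{A,\dots,A}_{\,n}\ \vdash\ \Delta}
\;(\mathrm{c}_n)
\]
```

For n = 1 this is ordinary contraction; the restriction n ⩾ 2 is what connects these systems to the (n + 1)-valued Łukasiewicz logics.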
By
Buszkowski, Wojciech
12 Citations
We prove the finite model property (fmp) for BCI and BCI with additive conjunction, which answers some open questions in Meyer and Ono [11]. We also obtain similar results for some restricted versions of these systems in the style of the Lambek calculus [10, 3]. The key tool is the method of barriers, which was earlier introduced by the author to prove fmp for the product-free Lambek calculus [2] and the commutative product-free Lambek calculus [4].
By
Dillon, Lisa Y.
4 Citations
The comparative use of census data is a useful way to study social characteristics across national boundaries. However, truly comparative demographic history is not possible without fully integrating separate census data, uniting multiple data files with a common set of comparably coded variables. This paper describes the integration of the 1871 Canadian census public use sample with similar samples of the 1850 and 1880 American censuses to form the Integrated Canadian-American Public Use Microdata Series (ICAPUMS). These data sets lent themselves well to integration because of their strong similarities in sampling design, data collection and data organization. Consistency in the availability and treatment of variables also eased integration of the samples, although the harmonization of occupation variables presented significant challenges. The ICAPUMS features a general household relationship variable which allows us to examine household structure across the two countries and three years. The paper concludes by proposing some general principles of census data set integration. This integrated data set is now available to researchers on the website of the University of Minnesota Historical Census Projects (www.hist.umn.edu/~ipums).
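The harmonization step described here, recoding each sample's occupation variable into one comparably coded scheme, can be sketched as follows. This is a minimal illustration; all codes and the crosswalk are invented and bear no relation to the actual ICAPUMS coding.

```python
# Hypothetical occupation codebooks for two census samples
# (all codes invented for illustration).
CANADA_1871 = {"farmer": 10, "labourer": 20, "carpenter": 31}
US_1880 = {"farmer": 100, "laborer": 200, "carpenter": 310}

# A crosswalk from each sample's codes into a shared integrated scheme.
TO_INTEGRATED = {
    10: 1, 100: 1,    # agriculture
    20: 2, 200: 2,    # unskilled labour
    31: 3, 310: 3,    # skilled trades
}

def integrate(records, codebook):
    """Recode one sample's occupation strings into the shared scheme."""
    return [TO_INTEGRATED[codebook[occ]] for occ in records]

# The two samples now carry one comparably coded variable.
print(integrate(["farmer", "labourer"], CANADA_1871))  # [1, 2]
print(integrate(["farmer", "laborer"], US_1880))       # [1, 2]
```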
By
Sigelman, Lee; Martindale, Colin; McKenzie, Dean
1 Citation
The extraordinary impact of Thomas Paine's Common Sense has often been attributed to its style: to the simplicity and forcefulness with which Paine expressed ideas that many others before him had expressed. Comparative analysis of Common Sense and other pre-Revolutionary pamphlets suggests that Common Sense was indeed stylistically unique; no other pamphleteer came close to matching Paine's combination of simplicity and forcefulness.
By
Greenstein, Daniel
This historiographical article surveys the different developmental trajectories of computeraided historical research and teaching in Western Europe and in the United States, and seeks synergies which promise to enhance the discipline.
By
Brown, Mark A.
3 Citations
Normal systems of modal logic, interpreted as deontic logics, are unsuitable for a logic of conflicting obligations. By using modal operators based on a more complex semantics, however, we can provide for conflicting obligations, as in [9], which is formally similar to a fragment of the logic of ability later given in [2]. Having gone that far, we may find it desirable to be able to express and consider claims about the comparative strengths, or degrees of urgency, of the conflicting obligations under which we stand. This paper, building on the formalism of the logic of ability in [2], provides a complete and decidable system for such a language.
By
Alchourrón, Carlos E.
16 Citations
The purpose of the paper is to present a logical framework that allows one to formalize prima facie duties, defeasible conditional duties, indefeasible conditional duties and actual (indefeasible) duties, as well as to show their logical interconnections.
By
Prakken, Henry; Sergot, Marek
107 Citations
We investigate under what conditions contrary-to-duty (CTD) structures lacking temporal and action elements can be given a coherent reading. We argue, contrary to some recent proposals, that CTD is not an instance of defeasible reasoning, and that the methods of nonmonotonic logics are inadequate since they are unable to distinguish between defeasibility and violation of primary obligations. We propose a semantic framework based on the idea that primary and CTD obligations are obligations of different kinds: a CTD obligation pertains to, or presupposes, a certain context in which a primary obligation is already violated. This framework is presented initially as an extension of Standard Deontic Logic (SDL), a normal modal logic of type KD, and is illustrated by application to a series of examples. The concluding section is concerned with some resemblances between CTD and defeasible reasoning. We show first that the SDL-based framework contains a flaw and must be adjusted. A discussion of possible adjustments, including an alternative treatment in terms of a preference-based semantics, reveals difficulties that are reminiscent of problems in defeasible reasoning and in intensional accounts of defeasible conditionals.
By
Morreau, Michael
5 Citations
Sir David Ross introduced prima facie duties, or acts with a tendency to be duties proper. He also spoke of general prima facie principles, which attribute to acts having some feature the tendency to be a duty proper. Like Utilitarians from Mill to Hare, he saw a role for such principles in the epistemology of duty: in the process by means of which, in any given situation, a moral code can help us to find out what we ought to do.
After formalizing general prima facie principles as universally quantified conditionals, I will show how seeming duties can be detached from them. There will be examples involving lies, burnt offerings and the question of whether to have a napkin on your lap while eating asparagus. They will illustrate the defeasibility of this detachment, how it can lead into dilemmas, and how general prima facie principles are overridden by more specific ones.
By
Carmo, José; Jones, Andrew J. I.
11 Citations
The paper discusses the potential value of a deontic approach to database specification. More specifically, some different types of integrity constraints are considered and a distinction is drawn between necessary (“hard”) and deontic (“soft”) constraints.
Databases are compared with other normative systems. A deontic logic for database specification is proposed and the problems of how to react to, and of how to correct, or repair, a situation which arises through norm violation are discussed in the context of this logic. The limitations of the proposed logic and possible modifications and extensions of it are analysed.
By
Dignum, F.; Meyer, J. J. Ch.; Wieringa, R. J.
17 Citations
We present a solution to the paradox of free choice permission by introducing strong and weak permission in a deontic logic of action. It is shown how counterintuitive consequences of strong permission can be avoided by limiting the contexts in which an action can be performed. This is done by introducing the “only” operator, which allows us to say that only α is performed (and nothing else), and by introducing a contextual interpretation of action terms.
By
Prakken, Henry
16 Citations
This paper compares two ways of formalising defeasible deontic reasoning, both based on the view that the issues of conflicting obligations and moral dilemmas should be dealt with from the perspective of nonmonotonic reasoning. The first way is developing a special nonmonotonic logic for deontic statements. This method turns out to have some limitations, for which reason another approach is recommended, viz. combining an already existing nonmonotonic logic with a deontic logic. As an example of this method the language of Reiter's default logic is extended to include modal expressions, after which the argumentation framework in default logic of [20, 22] is used to give a plausible logical analysis of moral dilemmas and prima facie obligations.
By
Asher, Nicholas; Bonevac, Daniel
9 Citations
This paper presents a nonmonotonic deontic logic based on commonsense entailment. It establishes criteria a successful account of obligation should satisfy, and develops a theory that satisfies them. The theory includes two conditional notions of prima facie obligation. One is constitutive; the other is epistemic, and follows nonmonotonically from the constitutive notion. The paper defines unconditional notions of prima facie obligation in terms of the conditional notions.
By
McNamara, Paul
9 Citations
On the traditional deontic framework, what is required (what morality demands) and what is optimal (what morality recommends) can't be distinguished, and hence they can't both be represented. Although the morally optional can be represented, the supererogatory (exceeding morality's demands), one of its proper subclasses, cannot be. The morally indifferent, another proper subclass of the optional, one obviously disjoint from the supererogatory, is also not representable. Ditto for the permissibly suboptimal and the morally significant. Finally, the minimum that morality allows finds no place in the traditional scheme. With a focus on the question, “What would constitute a hospitable logical neighborhood for the concept of supererogation?”, I present and motivate an enriched logical and semantic framework for representing all these concepts of common-sense morality.
By
Baayen, R. Harald; Lieber, Rochelle
This paper addresses the relation between meaning, lexical productivity, and frequency of use. Using density estimation as a visualization tool, we show that differences in semantic structure can be reflected in probability density functions estimated for word frequency distributions. We call attention to an example of a bimodal density, and suggest that bimodality arises when distributions of well-entrenched lexical items, which appear to be lognormal, are mixed with distributions of productively created nonce formations.
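The proposed mixture mechanism, well-entrenched lognormal items mixed with low-frequency nonce formations, can be sketched with a toy simulation. All parameters are invented, and a simple histogram density estimate stands in for the authors' estimator.

```python
import numpy as np

rng = np.random.default_rng(1)

# A mixture of two lognormal word-frequency distributions:
# well-entrenched items (high median frequency) and nonce
# formations (low median). All parameters are invented.
entrenched = rng.lognormal(mean=4.0, sigma=0.5, size=500)
nonce = rng.lognormal(mean=0.5, sigma=0.5, size=500)
freqs = np.concatenate([entrenched, nonce])

# A crude histogram density estimate on the log-frequency scale.
logf = np.log(freqs)
counts, edges = np.histogram(logf, bins=30, density=True)

def density_at(x):
    """Estimated density at log-frequency x."""
    i = int(np.clip(np.searchsorted(edges, x) - 1, 0, len(counts) - 1))
    return counts[i]

# The estimate is bimodal: high near each component's median
# log-frequency (0.5 and 4.0), low in the trough between them.
print(density_at(0.5) > density_at(2.25))  # True
print(density_at(4.0) > density_at(2.25))  # True
```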
By
Lokhorst, Gert Jan C.
6 Citations
We describe a new way in which theories about the deontic status of actions can be represented in terms of the standard two-sorted first-order extensional predicate calculus. Some of the resulting formal theories are easy to implement in Prolog; one prototype implementation, R. M. Lee's deontic expert shell DX, is briefly described.
By
Simons, Gary F.
Debate has long been a hallmark of the academic endeavor. The recent introduction of computers into academic life has not been the deus ex machina to bring sudden resolution to these debates. There is a new computing technology, however, that has some promise in this regard. It is called conceptual modeling. This paper demonstrates how a computer-based model of a problem domain can lead to consensus when competing approaches to the domain can be encapsulated in different visual models that are applied to the same underlying conceptual model.
By
Levy, Mike
In a paper written in 1987 entitled “Computers and the Humanities Courses: Philosophical Bases and Approaches”, Nancy Ide put forward two views on teacher education in humanities computing, the “Expert User's View” and the “Holistic View”. Ide's two views are derived from the collective opinions given by members of a workshop on teaching computing and humanities courses. In this article the degree to which Ide's two views can be substantiated in Computer-Assisted Language Learning (CALL) is explored, through a review of the literature and through an international survey on CALL materials development conducted by the author in 1991 (Levy, 1994). On this basis, and given the scarcity of holistic courses in CALL, a rationale for a CALL course with a holistic orientation is presented.
By
Shirahata, Masaru
4 Citations
In this paper, we develop the system LZF of set theory with unrestricted comprehension in full linear logic and show that LZF is a conservative extension of ZF^{−}, i.e., Zermelo-Fraenkel set theory without the axiom of regularity. We formulate LZF as a sequent calculus with abstraction terms and prove the partial cut-elimination theorem for it. The cut-elimination result ensures the subterm property for those formulas which contain only terms corresponding to sets in ZF^{−}. This implies that LZF is a conservative extension of ZF^{−} and therefore the former is consistent relative to the latter.
By
Hoek, Wiebe; Jaspars, Jan; Thijsse, Elias
6 Citations
We propose an epistemic logic in which knowledge is fully introspective and implies truth, although truth need not imply epistemic possibility. The logic is presented in sequential format and is interpreted in a natural class of partial models, called balloon models. We examine the notions of honesty and circumscription in this logic: What is the state of an agent that ‘only knows ϕ’ and which honest ϕ enable such circumscription? Redefining stable sets enables us to provide suitable syntactic and semantic criteria for honesty. The rough syntactic definition of honesty is the existence of a minimal stable expansion, so the problem resides in the ordering relation underlying minimality. We discuss three different proposals for this ordering, together with their semantic counterparts, and show their effects on the induced notions of honesty.
By
Merriam, Thomas
Starting from the accepted premise that Marlowe influenced the young Shakespeare, a selection of strongly contextual words characteristic of Tamburlaine is shown to be reflected in Shakespeare's early history plays. Principal components analysis confirms this. A very similar configuration, however, results when the selected Marlowe-preferred words are non-contextual common words. This feature cannot be explained by influence in its conventional sense, particularly when the Shakespeare plays closest to Marlowe are those that share with Marlowe a dearth of selected Shakespeare-preferred common words.
Greene, Kyd and Peele are tested to see if they attract to themselves the same plays as those drawn by Marlowe. Marlowe is seen to be the overriding magnet in all cases.
Both conventional and non-contextual words link ten Marlowe works, even though seven are plays, and two of the ten are translations from Latin. Eight Shakespeare plays can be distinguished as somewhat separate from the other 28 Shakespeare plays and as having an affinity with the Marlowe works.
By
Foster, Donald W.
2 Citations
In “And Then There Were None,” Ward Elliott and Robert Valenza report on the work of the Shakespeare Clinic (Claremont McKenna Colleges, 1987–1995). Working from popular theories that William Shakespeare is not the true author of the plays and poems ascribed to him, Elliott and Valenza cast a broad net to find another writer whose distinctive linguistic features match those of the Shakespeare canon. A regime of 51 tests was designed with which to compare Shakespeare's drama with 79 non-Shakespearean (or at least non-canonical) plays. Success rates at or near 100% are reported for the Elliott-Valenza tests in distinguishing Shakespeare from non-Shakespeare. A smaller battery of tests was designed for distinguishing Shakespeare's poems from non-dramatic texts by other poets, with similar success rates being reported. But many of the Elliott-Valenza tests are deeply flawed, both in their design and in their execution.
By
Giuntini, Roberto
21 Citations
We introduce the notion of a quantum MV algebra (QMV algebra) as a generalization of MV algebras, and we show that the class of all effects of any Hilbert space gives rise to an example of such a structure. We investigate some properties of QMV algebras and prove that QMV algebras represent non-idempotent extensions of orthomodular lattices.
By
Kroon, F. W.
This paper deals with a philosophical question that arises within the theory of computational complexity: how to understand the notion of INTRINSIC complexity or difficulty, as opposed to notions of difficulty that depend on the particular computational model used. The paper uses ideas from Blum's abstract approach to complexity theory to develop an extensional approach to this question. Among other things, it shows how such an approach gives detailed confirmation of the view that subrecursive hierarchies tend to rank functions in terms of their intrinsic, and not just their model-dependent, difficulty, and it shows how the approach allows us to model the idea that intrinsic difficulty is a fuzzy concept.
By
Kanovei, Vladimir; Reeken, Michael
6 Citations
In this article we show how the universe of HST, Hrbaček set theory (a nonstandard set theory of “external” type, which includes, in particular, the ZFC Replacement and Separation schemata for all formulas in the language containing the membership and standardness predicates, and Saturation for “standard size” families of internal sets, but does not include the Power Set axiom), admits a system of subuniverses which keep Replacement, model Power Set and Choice (in fact all of ZFC, with the exception of the Regularity axiom, which is instead replaced by Regularity over the internal subuniverse), and also keep as much of Saturation as is necessary.
This gives sufficient tools to develop the most complicated topics in nonstandard analysis, such as Loeb measures.
By
Elliott, Ward E. Y.; Valenza, Robert J.
12 Citations
The Shakespeare Clinic has developed 51 computer tests of Shakespeare play authorship and 14 of poem authorship, and applied them to 37 claimed “true Shakespeares,” to 27 plays of the Shakespeare Apocrypha, and to several poems of unknown or disputed authorship. No claimant, and none of the apocryphal plays or poems, matched Shakespeare. Two plays and one poem from the Shakespeare Canon do not match the others: Titus Andronicus; Henry VI, Part 3; and “A Lover's Complaint.”
By
Whissell, Cynthia
20 Citations
Traditional stylometric measures such as word usage, word length, and word repetition were paired with six new measures that described word emotionality in terms of a word's pleasantness, its activation level, and the combination of these factors. All measurements were applied to the songs composed by the Beatles' Paul McCartney and John Lennon between 1962 and 1970. Stylistic and emotional differences between composers and across years were found to be in agreement with observations made by critics and analysts of the Beatles' songs, suggesting that emotional stylometry is a valid instrument for the analysis of text. Lennon was the less pleasant and the sadder lyricist, and the Lennon-McCartney lyrics became less pleasant, less active, and less cheerful over time. Several other differences were observed and reported. A technique for summary emotional description (the emotion clock) was also introduced.
By
Hirokawa, Sachio; Komori, Yuichi; Takeuti, Izumi
4 Citations
A reduction rule is introduced as a transformation of proof figures in implicational classical logic. Proof figures are represented as typed terms in a λ-calculus with a new constant P^{((α→β)→α)→α}. It is shown that all terms with the same type are equivalent with respect to β-reduction augmented by this P-reduction rule. Hence all the proofs of the same implicational formula are equivalent. It is also shown that strong normalization fails for βP-reduction. Weak normalization is shown for βP-reduction combined with another reduction rule, which simplifies the α of ((α → β) → α) → α into an atomic type.
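For orientation (standard background, not taken from the paper), the type of the constant P is Peirce's law, whose addition to the implicational fragment of intuitionistic logic yields full classical implicational logic:

```latex
% Peirce's law as the type of the constant P:
\[
  P \;:\; ((\alpha \to \beta) \to \alpha) \to \alpha .
\]
% Intuitionistically this type is uninhabited; postulating P as a
% constant is what makes the typed terms correspond to classical,
% rather than merely intuitionistic, implicational proofs.
```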
By
Saeboe, Kjell Johan
6 Citations
The purpose of this paper is to use an anaphoric notion of presupposition for solving the problem of zero argument anaphora. Since Shopen (1973) it has been known that many missing arguments have an anaphoric interpretation, but it has not been known how this interpretation arises. I argue that these arguments are involved in presuppositions. On an anaphoric account of presuppositions as in van der Sandt (1992) or Kamp and Roßdeutscher (1992), it can be shown that the zero arguments acquire an anaphoric interpretation through the presuppositions. The analysis rests on the principle that the Discourse Representation Structure for the presupposition is proper, so that the discourse referents for the zero arguments are in its universe and must be anchored to discourse referents in the context.
By
Wanner, Leo
5 Citations
We present in this article the state of the art in lexical choice research in text generation and machine translation. The existing implementations are discussed with respect to four topics: (i) the place of lexical choice in the overall generation process, (ii) the information flow within the generation process and the consequences thereof for lexical choice, (iii) the internal organization of the lexical choice process, (iv) the phenomena covered by lexical choice. Possible future directions in lexical choice research are identified.
By
Stede, Manfred
We describe the architecture of a sentence generation module that maps a language-neutral “deep” representation to a language-specific sentence-semantic specification, which is then handed over to a conventional front-end generator. Lexicalization is seen as the main task in the mapping step, and we specifically examine the role of verb semantics in the process. By separating the various kinds of knowledge involved, for related languages (such as English and German) the task of multilingual sentence generation can be treated as a variant of the monolingual paraphrasing problem.
By
Horacek, Helmut
3 Citations
Recently, scientific interest in addressing metonymy from a computational perspective has increased significantly. Considerable effort has been invested here, but issues concerning metonymy in the context of natural language generation have so far been largely ignored, and comparable multilingual analyses are also rather sparse. Motivated by these shortcomings, we investigate methods for representing the knowledge required to express metonymic relations in several ways and in multiple languages, and we present techniques for generating these alternative verbalizations. In particular, we demonstrate how mapping schemata that enable lexical expressions to be built on the basis of conceptual specifications are derived from the Qualia Structure of Pustejovsky's Generative Lexicon. Moreover, our enterprise has exposed interesting cross-language differences, notably the use of prefixed verbs and compound nouns in German, as opposed to the widely equivalent expressions entailing implicit metonymic relations frequently found in English. A main achievement of our approach lies in bringing computational lexical semantics and natural language generation closer together, so that the linguistic foundations of lexical choice in natural language generation are strengthened.
By
Elhadad, Michael
This paper presents a lexical choice component for complex noun phrases. We first explain why lexical choice for NPs deserves special attention within the standard pipeline architecture for a generator. The task of the lexical chooser is more complex for NPs than for clauses because the syntax of NPs is less well understood; consequently, syntactic realization components accept a predicate-argument structure as input for clauses but require a purely syntactic tree as input for NPs. The task of mapping conceptual relations to different syntactic modifiers is therefore left to the lexical chooser for NPs.
The paper focuses on the syntagmatic aspect of lexical choice, identifying a process called “NP planning”. It focuses on a set of communicative goals that NPs can satisfy and specifies an interface between the different components of the generator and the lexical chooser.
The technique presented for NP planning encapsulates a rich lexical knowledge and allows for the generation of a wide variety of syntactic constructions. It also allows for a large paraphrasing power because it dynamically maps conceptual information to various syntactic slots.
By
Dorr, Bonnie J.; Olsen, Mari Broman
4 Citations
Multilingual generation in machine translation (MT) requires a knowledge organization that facilitates the task of lexical choice, i.e. selection of the lexical units to be used in the generation of a target-language sentence. This paper investigates the extent to which lexicalization patterns involving the lexical aspect feature [+telic] may be used for translating events and states among languages. Telicity has been correlated syntactically with both transitivity and unaccusativity, and semantically with Talmy's ‘path’ of a motion event, the representation of which characterizes languages parametrically.
Taking as our starting point the syntactic/semantic classification in Levin's English Verb Classes and Alternations, we examine the relation between telicity and the syntactic contexts, or alternations, outlined in this work, identifying systematic relations between the lexical aspect features and the semantic components that potentiate these alternations. Representing lexical aspect — particularly telicity — is therefore crucial for the tasks of lexical choice and syntactic realization. Having enriched the data in Levin (by correlating the syntactic alternations (Part I) with the semantic verb classes (Part II) and marking them for telicity), we assign to verbs lexical semantic templates (LSTs). We then demonstrate that it is possible from these templates to build a large-scale repository for lexical conceptual structures which encode meaning components that correspond to different values of the telicity feature. The LST framework preserves both semantic content and semantic structure (following Grimshaw) during the processes of lexical choice and syntactic realization. Application of this model identifies precisely where the Knowledge Representation component may profitably augment our rules of composition, to identify cases where the interlingua underlying the source language sentence must be either reduced or modified in order to produce an appropriate target language sentence.
By
Mehl, Stephan
1 Citation
The process of lexical choice usually consists of determining a single way of expressing a given content. In some cases such as gerund translation, however, there is no single solution; a choice must be made among several variants which differ in their syntactic behavior. Based on a bilingual corpus analysis, this paper explains first which factors influence the availability of variants. In a second step, some criteria for deciding on one or the other variant are discussed. It will be shown that the stylistic evaluation of the syntactic structures induced by alternative lexical items is of central importance in lexical choice. Finally, an implementation of the resulting model is described.
By
Johnson, Eric
Many university students are eager to use computers to analyze and compare texts and to do various kinds of computerized literary research. If students themselves cannot create the software they want to use, and if the desired software is not available from commercial sources, a professor can be of great assistance by writing computer programs for students. Fifteen professor-created programs for text analysis are described.
By
Davis, Charles T., III
2 Citations
Interpretation of literary texts is a multidimensional task requiring students to master a variety of skills and to acquire factual knowledge in diverse areas. The use of the World Wide Web in conjunction with student computer accounts has allowed me to create a virtual classroom in which students can explore many historical and theoretical aspects of interpretation through the use of HTML tutorials and Web resources. The discussion of the literary text in a community of scholars can take center stage in the physical classroom.
By
Birkenstock, SusanMarie
For the most part, students entering my course in Writing About Literature — and the world of literary analysis — believe that literary works are dead, inflexible, and univocal. Faced with these recurring attitudes about literature, I discovered that the MOO, a text-based virtual reality, is the ideal environment for a literature course. In the MOO the text remains “alive” while insisting upon its diverse readings. Initially the MOO is used for group discussions about the literature being studied. At semester's end the MOO becomes the site of a group project in which the students enter cyberspace as characters they have created. Here they collaboratively write and perform a script as those characters — an act I call “performance scripting.”
By
Boaz, John K.; Boaz, Mildred M.
1 Citation
The authors developed a CD-ROM for use in the classroom, the computer lab, the library, or the personal computers of students and faculty. The subject of the CD is an interart study of music, art, and literature, specifically T. S. Eliot's The Waste Land. The authors trace their process of formulating the idea and conceptualizing the project. They also detail the initial stages of the project, including the time-intensive efforts of getting permissions for the materials. For the technology portion, they describe how the components were digitized, integrated, and tested. Finally, they discuss how they manufactured and marketed the product, closing with some evaluative comments.
By
Mills, Jon; Chandramohan, Balasubramanyam
We used the TACT computer software to teach Joseph Conrad's novel Heart of Darkness to BA (Hons) students at the University of Luton in England. Conrad's novel is one of the texts used in the ‘Language and New Literatures’ modules (units). In these modules we combine analytical approaches to literary texts with linguistic methods. We used TACT to reinforce the understanding of the text of Heart of Darkness achieved through such a combination of methods. An exposure to the computer-based approaches to the text described in this article made the students' interaction with the text a more complex and rewarding experience.
By
Havholm, Peter; Stewart, Larry
1 Citation
The explicit consideration of literary theory has become increasingly important both in the field of textual studies generally and in undergraduate literature courses. But theory can seem vague and inconsequential to undergraduates. Our students use hypertext to model intertextuality and the Linear Modeling Kit (a software program we have developed) to model structuralist ideas about narrative. In making computer models, students explore the implications of analytic ideas by attempting to represent them in formal (in the sense of programmable) terms. Our experience shows that such modeling stimulates student questioning and discussion of marked precision and sophistication.
By
Potter, Rosanne G.
This summary essay comments on the contents of and issues raised by the special number of Computers and the Humanities on Computers and the Teaching of Literature. It argues against the use of hypertextual resources without a careful pedagogical understanding of the dangers they present of encouraging students to become passive consumers rather than active thinkers. It argues for the use of computer-mediated conversation, computer modeling, and computer analyses of texts as appropriate applications in the literature classroom.
By
Katz, Seth R.
2 Citations
Literature instructors are using hypertext to enhance their teaching in a broad variety of ways that includes putting course materials on the WWW; creating online tutorials; using annotated hypertexts in addition to or in lieu of print texts; having students write hypertexts; examining the medium of hypertext as a literary and cultural theme; and studying hypertext fiction in the context of traditional literature classes. The article describes examples of each of these uses of hypertext in teaching literature and provides sources of further examples of and information on using hypertext as a teaching tool in literature classes.
By
Smith, Jonathan
1 Citation
The essay describes the use of George P. Landow's hypertext, The Dickens Web, in an advanced undergraduate literature class and analyzes its practical and theoretical implications. Hypertext is shown to encourage active student engagement, especially with contextual material; to lead to more focused research topics; and to facilitate student collaboration. Some of Landow's claims about the ease with which this occurs, however, are questioned. The difficulty of teaching students how to follow and construct conceptual hypertextual links is examined, and the instructor's role in relation to student contributions to the Web is presented as much more problematic than Landow allows.
By
Jamieson, Marguerite; Kajs, Rebecca; Agee, Anne
1 Citation
When students use computers as learning tools, the whole process of learning, and, indeed, the learners themselves, are transformed. This article illustrates some techniques that foster transformative learning in computer-assisted first-year literature classes: first, a lesson plan on “A Valediction: Forbidding Mourning” that uses Microsoft Word functions, including format painter, tables, and annotation, to explore meaning in context; second, a plan for learners to use subconference options in the Daedalus Interactive Writing Environment to analyze Oedipus Rex; finally, a demonstration of how students engage in a meta-reflection process as they explore “Barn Burning” with Freelance Graphics.
By
Lascarides, Alex; Briscoe, Ted; Asher, Nicholas; Copestake, Ann
18 Citations
We define an order-independent version of default unification on typed feature structures. The operation is one in which default information in a feature structure typed with a more specific type overrides default information in a feature structure typed with a more general type, where specificity is defined by the subtyping relation in the type hierarchy. The operation is also able to handle feature structures in which reentrancies are default. We provide a formal semantics, prove order independence, and demonstrate the utility of this version of default unification in several linguistic applications. First, we show how it can be used to define multiple orthogonal default inheritance in the lexicon in a fully declarative fashion. Secondly, we show how default lexical specifications (introduced via default lexical inheritance) can be made to usefully ‘persist beyond the lexicon’ and interact with syntagmatic rules. Finally, we outline how persistent default unification might underpin default feature propagation principles and a more restrictive, constraint-based approach to lexical rules.
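The override behavior described in the abstract can be illustrated with a deliberately simplified sketch. The feature structures here are flat dictionaries and the type hierarchy is a hypothetical linear specificity ranking; the actual operation works on typed feature structures with default reentrancies and is considerably more involved (and handles equal-specificity conflicts, which this sketch does not).

```python
# Hypothetical flat specificity ranking standing in for a type hierarchy.
SPECIFICITY = {"sign": 0, "verb": 1, "past-verb": 2}

def default_unify(fs1, fs2):
    """Combine two flat feature structures whose values are
    (value, introducing_type) pairs.  On a conflict, the default coming
    from the more specific type overrides the more general one, so the
    result does not depend on argument order."""
    result = dict(fs1)
    for feat, (val, typ) in fs2.items():
        if feat not in result:
            result[feat] = (val, typ)
        elif SPECIFICITY[typ] > SPECIFICITY[result[feat][1]]:
            result[feat] = (val, typ)
    return result

# The general verb class supplies "-ed" as the default past form; the
# more specific class overrides it -- in either argument order.
verb = {"PAST": ("-ed", "verb"), "AGR": ("3sg", "verb")}
irregular = {"PAST": ("sang", "past-verb")}
```

With these inputs, `default_unify(verb, irregular)` and `default_unify(irregular, verb)` yield the same structure, keeping `AGR` from the general class and the overriding `PAST` from the specific one.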
By
Kettunen, Kimmo
This article focuses on typographical spell-checking. Typographical spell-checking verifies the use of characters such as ? !  ; : \sp $ @ and other special-purpose characters with respect to spaces or null elements. The author claims that this kind of spell-checking has not been developed to any substantial degree, although it could be of considerable practical use as a writer's aid. The article discusses the basic challenges of typographical spell-checking and shows that some of the difficulties are greater than might be expected at first sight. Rules for describing the behavior of typographical characters are proposed.
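The kind of rule the abstract describes can be sketched as regular expressions over character-and-space contexts. This is an illustration of the general idea, not the article's rule formalism; the rule set and messages are invented for the example.

```python
import re

# Each hypothetical rule pairs a pattern over punctuation/space contexts
# with a diagnostic message.
RULES = [
    (re.compile(r"\s+[,;:?!]"), "space before punctuation mark"),
    (re.compile(r"[,;:][^\s\d)]"), "missing space after punctuation mark"),
    (re.compile(r"\(\s+|\s+\)"), "space inside parentheses"),
]

def check_typography(text):
    """Return (position, message) pairs for every rule violation found."""
    problems = []
    for regex, message in RULES:
        for m in regex.finditer(text):
            problems.append((m.start(), message))
    return sorted(problems)
```

For instance, `check_typography("Hello ,world")` flags both the space before the comma and the missing space after it, while a correctly spaced sentence produces no diagnostics. Real rules would also need to handle numbers, quotations, and language-specific conventions (e.g. French spacing before `?` and `!`).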