Organiser: Uta Priss
Temporal Concept Analysis in SIENA
Karl Erich Wolff
(Ernst Schröder Center for Conceptual Knowledge Processing, Darmstadt)
The tutorial will enable users to understand their temporal data by first representing these data in CERNATO, scaling them, and generating suitable views which can be visualized in the Temporal Concept Analysis tool in SIENA (a part of TOSCANAJ), so that life tracks ("trajectories") of several objects can be studied simultaneously in "transition diagrams". The more general possibility of representing moving distributed objects (e.g. a high-pressure zone) is explained by examples which also show the applicability of this method for representing temporal fuzzy data.
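As a purely illustrative sketch of the underlying representation (a toy Python example, not the CERNATO or SIENA data format), temporal data can be arranged so that each (object, time point) pair becomes an "actual object" of a formal context, and the life track of an object is the time-ordered sequence of its actual objects:

    # Toy illustration of the representation behind Temporal Concept Analysis:
    # each (object, time granule) pair is an "actual object", and the life track
    # of an object is its time-ordered sequence of actual objects.
    # Hypothetical sketch only; not the CERNATO/SIENA data format.
    measurements = [
        # (object, time, observed value)
        ("ship_A", 1, "harbour"),
        ("ship_A", 2, "open_sea"),
        ("ship_B", 1, "open_sea"),
        ("ship_B", 2, "harbour"),
    ]

    # Nominal scaling: every observed value becomes a binary attribute of the actual object.
    context = {(obj, t): {f"position={val}"} for obj, t, val in measurements}

    # Life tracks ("trajectories"): time-ordered attribute sets per object.
    for obj in sorted({o for o, _, _ in measurements}):
        times = sorted(t for o, t, _ in measurements if o == obj)
        track = [(t, sorted(context[(obj, t)])) for t in times]
        print(obj, "->", track)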
Data analysis with the Formal Concept Analysis Research Toolbox (FCART)
Alexey Neznanov and Dmitry Ilvovsky
(School of Applied Mathematics and Information Sciences, NRU HSE, Russia)
We consider a full research cycle of data analysis using FCA-based techniques and methods. We go through this cycle with a new software tool, the Formal Concept Analysis Research Toolbox (FCART). The demonstration includes the following stages (stages 3 and 4.1 are sketched in code after the list):
1) accessing an external data source and loading data;
2) preprocessing the data and building a multi-valued context;
3) dynamic scaling of the multi-valued context;
4) working with the binary context;
4.1) constructing the concept lattice and visualizing its line diagram;
4.2) building sublattices;
4.3) filtering the source lattice and searching for formal concepts;
4.4) finding association rules and other related artifacts;
4.5) saving the lattice as a graph or an image;
5) building reports.
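To make stages 3 and 4.1 concrete, here is a minimal, self-contained Python sketch of nominal scaling of a small multi-valued context and a naive enumeration of all formal concepts of the resulting binary context; it is an illustration only and does not use the FCART API.

    # Toy sketch of stages 3 and 4.1: nominal scaling of a multi-valued context
    # and naive enumeration of all formal concepts of the resulting binary context.
    # Illustration only; this is not the FCART implementation or its API.
    from itertools import chain, combinations

    mv_context = {                      # objects -> {many-valued attribute: value}
        "doc1": {"language": "en", "length": "short"},
        "doc2": {"language": "ru", "length": "long"},
        "doc3": {"language": "en", "length": "long"},
    }

    # Stage 3: nominal scaling -- every (attribute, value) pair becomes a binary attribute.
    incidence = {g: {f"{a}={v}" for a, v in row.items()} for g, row in mv_context.items()}
    objects = sorted(incidence)
    attributes = sorted(set().union(*incidence.values()))

    def intent(A):                      # attributes common to all objects in A
        return set(attributes) if not A else set.intersection(*(incidence[g] for g in A))

    def extent(B):                      # objects having all attributes in B
        return {g for g in objects if B <= incidence[g]}

    # Stage 4.1: every pair (B', B'') is a formal concept, and all concepts arise this way.
    concepts = {(frozenset(extent(set(B))), frozenset(intent(extent(set(B)))))
                for B in chain.from_iterable(combinations(attributes, r)
                                             for r in range(len(attributes) + 1))}
    for ext, intn in sorted(concepts, key=lambda c: len(c[0])):
        print(sorted(ext), sorted(intn))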
Towards (re)construction of the theory of linguistic oppositions within the
framework of Interactive Linguistics
André Wlodarczyk
(Paris, Sorbonne)
1. Linguistic typology practice
The most characteristic difference between formal and typological linguistics is perhaps the fact that while the former starts with the thesis that "natural languages are like formal ones", the latter claims that "natural languages are ambiguous". Therefore, formal linguistics is deductive while typological linguistics is inductive.
2. The need for modelling with a KDD methodology using the "Semana" implementation of FCA
Nevertheless, a few examples suffice to show that the efforts of some typologists to use formal(ised) meta-languages in order to explain their observations of natural language phenomena fail because of the lack of means to verify the models they pursue.
I will try to show some results of our research aimed at adapting some aspects of FCA. However, these results are rather poor, because they are either still in the research-and-development phase or focus only on well-known mathematical concepts, such as Real and Ideal, which we rediscovered ourselves. Indeed, our concern is not only the application of FCA to linguistic objects but also the adaptation of FCA algorithms to what we feel could enable us to better explain natural language phenomena.
In the same spirit, we implemented an algorithm for approximating formal concepts. However, apart from our purely intellectual satisfaction and its hypothetical usefulness for machine 'understanding' of natural language expressions, we have not yet found any practical application of Rough FCA to linguistics.
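As one purely illustrative reading of "approximating formal concepts" (a generic rough-set-flavoured sketch in Python, not the algorithm implemented in Semana), an arbitrary object set can be approximated from above by the smallest concept extent containing it and from below by the union of the concept extents contained in it:

    # Rough-style approximation of an arbitrary object set by concept extents.
    # Hedged illustration only; not the Semana implementation.
    objects = {"g1", "g2", "g3", "g4"}
    attributes = {"m1", "m2", "m3"}
    incidence = {"g1": {"m1", "m2"}, "g2": {"m1"}, "g3": {"m2", "m3"}, "g4": {"m3"}}

    def intent(A):
        return set(attributes) if not A else set.intersection(*(incidence[g] for g in A))

    def extent(B):
        return {g for g in objects if B <= incidence[g]}

    def upper(A):
        """Smallest concept extent containing A: the closure A''."""
        return extent(intent(A))

    def lower(A):
        """Union of all concept extents contained in A: keep g whose object extent stays inside A."""
        return {g for g in A if upper({g}) <= A}

    A = {"g1", "g2", "g4"}
    print(lower(A), "<=", A, "<=", upper(A))   # {'g1', 'g2'} <= A <= all four objects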
The above problems will be discussed with examples of interaction with Semana, because I am genuinely seeking how to use FCA technology properly in order to discover reliable knowledge about natural language 'systems'.
3. FCA view on natural language
Due to the ambiguous nature of linguistic signs, typologists need to describe
linguistic units in contexts of their use. Therefore, they have to aim at defining
usages of units instead of the units themselves. The question is what Formal Concepts represent in such specific Contexts. To answer this question, it is especially
important to study the relationships (oppositions) between grammatical units because
of their abstract orthogonality.
In Search for Meaning, Values, Aims, Measures by the GABEK Methodology
Josef Zelger
(Univ.-Prof. i. R., Department of Philosophy, University of Innsbruck)
GABEK® (Ganzheitliche Bewältigung von Komplexität - Holistic Coping with Complexity) is a qualitative method for opinion research, knowledge processing, and organization development. It allows verbal data to be represented as conceptual nets, where lexical concepts are the nodes and statements the edges. Conceptual nets can be reduced in complexity and can be used for the selection of core topics as well as for concept analysis and ontology generation. In a second step, through the coding of evaluations expressed in the texts by the respondents, systems of values and goals are analyzed in an intersubjective and reconstructable way. In the third step of analysis, the respondents' causal assumptions are coded and displayed graphically. New fields and potentials of action are thus opened up. Finally, by the construction of a "Gestalten-tree", the verbal data are organized in a logical and hierarchical order, which provides a structured overview and furthers holistic understanding.
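As a small, hypothetical illustration of the conceptual nets described above (a toy sketch in Python with invented data, not the GABEK software), coded statements can be turned into a net whose nodes are lexical concepts and whose edges record the statements linking them:

    # Toy sketch of a conceptual net: lexical concepts as nodes, statements as edges.
    # Hypothetical data and code; not the GABEK software.
    from itertools import combinations
    from collections import defaultdict

    statements = {                      # statement id -> lexical concepts assigned by coding
        "s1": {"workload", "stress"},
        "s2": {"stress", "teamwork"},
        "s3": {"teamwork", "workload", "motivation"},
    }

    net = defaultdict(set)              # (concept, concept) -> statements mentioning both
    for sid, concepts in statements.items():
        for a, b in combinations(sorted(concepts), 2):
            net[(a, b)].add(sid)

    for (a, b), sids in sorted(net.items()):
        print(f"{a} -- {b}: {sorted(sids)}")
    # Concepts that co-occur in many statements are candidates for core topics.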
Exploratory Programming for Formal Concept Analysis – An Introduction
to conexp-clj
Daniel Borchmann
(TU Dresden)
We give a gentle introduction to conexp-clj, a general-purpose tool for formal concept analysis. To this end, we shall introduce the basic notions and design principles of the program and of its underlying programming language, Clojure, and discuss examples, both simple and complex, that can be handled by conexp-clj.