Structure and Its Use for Recognition

Publications

Authors and Title
S. Basovník, F. Mráz: Learning Limited Context Restarting Automata by Genetic Algorithms (Technical Report)
Technical report, 2011, Charles University, Faculty of Mathematics and Physics, Prague
Abstract: We propose a genetic algorithm for learning restricted variants of restarting automata from positive and negative samples. Experiments comparing the proposed genetic algorithm to the algorithms RPNI and LARS on sample languages indicate that the new algorithm is able to infer a target language even from a small set of samples.
Links: BibTeX, Fulltext
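
The report itself should be consulted for the actual algorithm. As a rough illustration of the idea in the abstract, the Python sketch below shows how a candidate automaton, represented as a list of limited-context deletion (clearing) rules, could be scored against positive and negative samples inside a genetic algorithm. The rule format, the acceptance test, and the toy rule set are assumptions made here for illustration, not taken from the report.

SENT_L, SENT_R = "¢", "$"          # left and right tape sentinels

def accepts(rules, word):
    """True if `word` reduces to the empty word by repeatedly deleting a
    factor whose contexts match some rule (left_context, factor, right_context)."""
    seen = set()

    def reducible(w):
        if w == "":
            return True
        if w in seen:
            return False
        seen.add(w)
        tape = SENT_L + w + SENT_R
        for left, factor, right in rules:
            pattern = left + factor + right
            start = tape.find(pattern)
            while start != -1:
                i = start + len(left)                    # factor start on the tape
                shorter = tape[:i] + tape[i + len(factor):]
                if reducible(shorter[1:-1]):             # strip the sentinels again
                    return True
                start = tape.find(pattern, start + 1)
        return False

    return reducible(word)

def fitness(rules, positives, negatives):
    """Fraction of correctly classified samples; the GA would maximise this."""
    correct = sum(accepts(rules, w) for w in positives)
    correct += sum(not accepts(rules, w) for w in negatives)
    return correct / (len(positives) + len(negatives))

# Toy rule set under which every word of the form a^n b^n reduces to the
# empty word; the listed negatives are rejected, so this prints 1.0.
rules = [("a", "ab", "b"), (SENT_L, "ab", SENT_R)]
print(fitness(rules, ["ab", "aabb", "aaabbb"], ["a", "ba", "abb", "aab"]))

A genetic algorithm would then select, cross over, and mutate such rule lists so as to maximise this fitness over the given samples.
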
P. Černo, F. Mráz: Delta-Clearing Restarting Automata and CFL
Proceedings of DLT 2011, the 15th International Conference on Developments in Language Theory (Milano, Italy), Springer, Berlin, 2011, LNCS Vol. 6795, pp. 153-164.
Abstract: Delta-clearing restarting automata represent a new restricted model of restarting automata which, based on a limited context, can either delete a substring of the current content of its tape or replace a substring by a special auxiliary symbol Delta, which cannot be overwritten anymore but can be deleted later. The main result of this paper is a proof that, despite their limited operations, Delta-clearing restarting automata recognize all context-free languages.
Links: BibTeX, Presentation, SpringerLink
P. Černo, F. Mráz: Delta-Clearing Restarting Automata and CFL (Technical Report)
Technical report, 2011, Charles University, Faculty of Mathematics and Physics, Prague
Abstract: See above.
Links: BibTeX, Fulltext
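
As a rough illustration of the operations described in the abstract above (not the paper's formal definition), the following Python sketch simulates instructions that, in a limited context, either delete a factor or replace it by the auxiliary symbol Δ. The instruction format, the handling of Δ, and the toy instruction set are assumptions made for illustration only.

DELTA, SENT_L, SENT_R = "Δ", "¢", "$"

def successors(word, instructions):
    """All words reachable from `word` by one instruction.  An instruction
    (left, factor, right, action) fires on left·factor·right; action "delete"
    removes the factor, action "mark" replaces it by Δ."""
    tape = SENT_L + word + SENT_R
    result = set()
    for left, factor, right, action in instructions:
        if action == "mark" and DELTA in factor:
            continue                              # Δ may not be overwritten (assumed)
        replacement = "" if action == "delete" else DELTA
        pattern = left + factor + right
        start = tape.find(pattern)
        while start != -1:
            i = start + len(left)
            rewritten = tape[:i] + replacement + tape[i + len(factor):]
            result.add(rewritten[1:-1])           # strip the sentinels again
            start = tape.find(pattern, start + 1)
    return result

def accepts(word, instructions):
    """Accept iff the word can be reduced to the empty word (brute-force search)."""
    frontier, seen = {word}, set()
    while frontier:
        w = frontier.pop()
        if w == "":
            return True
        if w not in seen:
            seen.add(w)
            frontier |= successors(w, instructions)
    return False

# Toy instruction set that exercises both operations: mark an inner "ab"
# with Δ, shrink the word around the mark, and finally delete the mark.
instructions = [
    ("a", "ab", "b", "mark"),
    ("a", "aΔb", "b", "delete"),
    (SENT_L, "ab", SENT_R, "mark"),
    (SENT_L, "aΔb", SENT_R, "delete"),
    (SENT_L, DELTA, SENT_R, "delete"),
]
print(accepts("aaabbb", instructions), accepts("abab", instructions))  # True False

Because a factor rewritten to Δ can only disappear through a later deletion, Δ acts as a persistent marker on the tape; according to the abstract, this small extension is what allows the model to recognize all context-free languages.
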
I. Mrázová, Z. Reitermanová: A New Sensitivity-Based Feature Selection Technique for Feed-Forward Neural Networks That Improves Generalization (Technical Report)
Technical report, 2011, Charles University, Faculty of Mathematics and Physics, Prague
Abstract: Multi-layer neural networks of the back-propagation type have already become a well-established tool used successfully in various application areas. Efficient solutions to the complex tasks currently being addressed require sufficient generalization capabilities of the formed networks and an easy interpretation of their function. For this reason, we introduce here a new feature selection technique called SCGSIR, inspired by the fast method of scaled conjugate gradients (SCG) and by sensitivity analysis.
An enforced internal knowledge representation supports an easy interpretation of the formed network structure. Inhibiting network sensitivity during training also supports successful pruning of input neurons and optimization of the network structure. Experiments performed so far on the problem of binary addition and on real data obtained from the World Bank yield promising results: the new technique outperforms reference techniques with respect to both the ability to find networks with optimum architectures and the generalization capabilities of the trained networks.
Links: BibTeX, Fulltext
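
SCGSIR itself combines scaled conjugate gradient training with sensitivity inhibition and is described in the report; the sketch below only illustrates the underlying sensitivity-analysis idea: rank the inputs of a trained feed-forward network by the average magnitude of the output's derivatives with respect to each input, then prune the weakest inputs. The tiny randomly initialised network, the data, and the pruning threshold are purely illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Toy "trained" network: 5 inputs, 4 hidden tanh units, 1 linear output.
W1, b1 = rng.normal(size=(4, 5)), rng.normal(size=4)
W2, b2 = rng.normal(size=(1, 4)), rng.normal(size=1)

def input_sensitivities(X):
    """Mean |d output / d input_i| over the data set X (rows are samples)."""
    Z = X @ W1.T + b1                    # hidden pre-activations, shape (n, 4)
    D = 1.0 - np.tanh(Z) ** 2            # tanh'(z) for every sample
    # Jacobian of the output w.r.t. the inputs for every sample:
    # dy/dx = W2 · diag(D) · W1, stacked into shape (n, 1, 5)
    J = np.einsum("kj,nj,ji->nki", W2, D, W1)
    return np.abs(J).mean(axis=(0, 1))   # one score per input feature

X = rng.normal(size=(200, 5))
scores = input_sensitivities(X)
keep = scores >= 0.2 * scores.max()      # illustrative pruning threshold
print("sensitivities:", np.round(scores, 3))
print("inputs kept:  ", np.flatnonzero(keep))
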
D. Pardubská, M. Plátek, F. Otto: Parallel communicating grammar systems with regular control and skeleton preserving FRR-automata
Theoretical Computer Science 412 (2011) 458–477.
Abstract: Parallel communicating grammar systems with regular control (RPCGS, for short) are introduced, which are obtained from returning regular parallel communicating grammar systems by restricting the derivations that are executed in parallel by the various components through a regular control language. For the class of languages that are generated by RPCGSs with constant communication complexity, we derive a characterisation in terms of a restricted type of freely rewriting restarting automaton. From this characterisation we obtain that these languages are semi-linear, and that for RPCGSs with constant communication complexity, the centralised variant has the same generative power as the non-centralised variant.
Links: BibTeX, ScienceDirect

Last updated: Sunday, 03/09/14