This contribution is devoted to the recent development of the MOSGRAF suite used to process Mössbauer data and to generate reference functions for modern spectrometers of the MsAa-x (x = 1, 2, 3, 4) family. The newly developed interface is designed for Microsoft Windows XP® or higher; however, older systems of this class can be used as well, provided some additional plug-ins are installed. The lowest usable system is the 32-bit edition of Microsoft Windows 98®. The new version of the MOSGRAF suite is compatible with both 32-bit and 64-bit systems. MOSGRAF is fully compatible with the MsAa-x spectrometers, and a powerful tool to convert ASCII data files from other spectrometers is provided as well. The data-processing programs are compiled with the high-efficiency Lahey-Fujitsu® Fortran-90 compiler. One of the most important new features is the ability to process velocity-calibration data obtained by the new method based on measuring the time lapse between fringes of the Michelson-Morley interferometer.
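The fringe-timing idea above can be sketched as follows: in a Michelson-type setup, each fringe corresponds to a mirror displacement of half the laser wavelength, so the velocity over one interval is v = (λ/2)/Δt. This is a minimal illustration only; function names are invented and none of MOSGRAF's actual routines are reproduced.

```python
# Hypothetical sketch of interferometric velocity calibration.
# Each fringe = lambda/2 of mirror travel, so v = (lambda/2) / dt.

HE_NE_WAVELENGTH_M = 632.8e-9  # He-Ne laser line, a common reference


def fringe_velocities(fringe_times_s, wavelength_m=HE_NE_WAVELENGTH_M):
    """Return |velocity| for each interval between consecutive fringes."""
    velocities = []
    for t0, t1 in zip(fringe_times_s, fringe_times_s[1:]):
        dt = t1 - t0
        velocities.append((wavelength_m / 2.0) / dt)  # v = (lambda/2) / dt
    return velocities


# Example: equidistant fringes every 31.64 microseconds -> 10 mm/s
times = [i * 31.64e-6 for i in range(5)]
v = fringe_velocities(times)
```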
In this article we present a novel method of hit-time and hit-position reconstruction in long scintillator detectors. We take advantage of the fact that for this kind of detector, the amplitude and shape of the registered signals depend strongly on the position where a particle hits the detector. The reconstruction is based on determining the degree of similarity between measured signals and averaged signals stored in a library for a set of well-defined positions along the scintillator. Preliminary results of the validation of the introduced method with experimental data obtained by means of the double-strip prototype of the J-PET detector are presented.
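The library-matching step can be sketched as below. The distance function (a plain sum of squared differences) and the argmin selection are assumptions for illustration; the actual J-PET similarity measure and library construction may differ.

```python
# Minimal sketch of library-based hit-position reconstruction.
# The chi^2-like distance and the toy signal library are illustrative.

def signal_distance(measured, reference):
    """Sum of squared sample-wise differences between two signals."""
    return sum((m - r) ** 2 for m, r in zip(measured, reference))


def reconstruct_position(measured, library):
    """library: {position_cm: averaged_signal}; return the closest position."""
    return min(library, key=lambda pos: signal_distance(measured, library[pos]))


# Toy library: signal amplitude decays with distance along the strip
library = {z: [a * (1.0 - 0.01 * z) for a in (0.1, 0.8, 0.4)]
           for z in (5, 25, 45)}
measured = [0.076, 0.61, 0.305]   # closest to the z = 25 cm template
```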
A graphene oxide suspension in various solvents was spin-coated on GaN/sapphire layers grown by metal-organic vapor-phase epitaxy. The samples were characterised using Raman spectroscopy and atomic force microscopy, before and after high-temperature treatment. We found that the graphene oxide was modified by the high-temperature treatment; however, a considerable modification was also observed as a result of the laser light impinging on the samples during the measurements. The Raman spectra were decomposed into two contributions showing different behaviour during the Raman scattering measurements.
A new specialized sampling converter for signal-timescale transformation, together with a digital-signal-processing algorithm for the automated measurement of settling times of fast digital-to-analogue converters, is presented. The new sampling device with numerically controlled oscillators allows the realization of different types of timescale transformation. The use of a ΣΔ analogue-to-digital converter and a first-in first-out memory significantly simplifies the device. The equations for the timescale-transformation ratio and the sampling step are presented. A method using a brick-wall comb filter in the frequency domain to filter the measurement signal has been developed; it is shown that such filtering significantly reduces the noise of the measurement signal. A new algorithm combining the brick-wall comb filter with averaging of the filtered signal has also been developed, and results of testing the developed digital-signal-processing algorithm are reported.
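The brick-wall comb filter can be illustrated as follows: transform to the frequency domain, keep only the bins at multiples of an assumed fundamental, zero everything else, and transform back. A naive O(N²) DFT is used here purely for self-containment; the converter-specific details of the article are not reproduced.

```python
# Illustrative brick-wall comb filter in the frequency domain.
import cmath
import math


def dft(x):
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * j * k / n) for k in range(n))
            for j in range(n)]


def idft(spec):
    n = len(spec)
    return [sum(spec[j] * cmath.exp(2j * cmath.pi * j * k / n) for j in range(n)) / n
            for k in range(n)]


def comb_filter(signal, fundamental_bin):
    """Zero every DFT bin that is not a multiple of the fundamental bin."""
    spectrum = dft(signal)
    kept = [c if j % fundamental_bin == 0 else 0.0
            for j, c in enumerate(spectrum)]
    return [v.real for v in idft(kept)]


# Wanted harmonic on bin 4 plus interference on (non-harmonic) bin 3, n = 32
n = 32
sig = [math.sin(2 * math.pi * 4 * k / n) + 0.5 * math.sin(2 * math.pi * 3 * k / n)
       for k in range(n)]
clean = comb_filter(sig, 4)
```

After filtering, only the bin-4 harmonic survives; the interference on bin 3 is removed exactly, which is what makes the "brick-wall" variant attractive for periodic measurement signals.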
The identification of patterns in stock markets has been an important subject for many years. In the past, numerous techniques, both technical and econometric, were used to predict changes in stock markets, but the dependences among the companies listed on a stock market were considered only to a limited extent. Numerous studies confirm that larger stocks appear to influence smaller ones and that, on a global level, most of the world's stock markets are integrated. Therefore, this study applies association rules, a data-mining approach, to explore the co-movement between stocks listed on the Warsaw Stock Exchange. We believe that in order to describe and to understand market behavior, data-mining techniques are more flexible in use than, for instance, pricing models based on finance theory. The former seem to be more effective for explaining market behavior without making particular assumptions.
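Association rules of the form {stock A up} → {stock B up} are conventionally scored by support and confidence; a minimal sketch with invented tickers and toy data follows (the study's actual rule-mining procedure and thresholds are not reproduced).

```python
# Toy sketch of scoring a co-movement association rule {A up} -> {B up}.

def rule_stats(days, antecedent, consequent):
    """Return (support, confidence) of 'antecedent up -> consequent up'."""
    n_total = len(days)
    n_a = sum(1 for d in days if d[antecedent] > 0)
    n_both = sum(1 for d in days if d[antecedent] > 0 and d[consequent] > 0)
    support = n_both / n_total
    confidence = n_both / n_a if n_a else 0.0
    return support, confidence


# Each record: sign of the daily return for two hypothetical stocks
days = [
    {"BIG": +1, "SMALL": +1},
    {"BIG": +1, "SMALL": +1},
    {"BIG": +1, "SMALL": -1},
    {"BIG": -1, "SMALL": -1},
    {"BIG": -1, "SMALL": +1},
]
support, confidence = rule_stats(days, "BIG", "SMALL")
```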
Interferometric measuring systems are frequently used for the precise determination of changes. The wave property of light is exploited in such systems, which makes it possible to perform measurements with nanometer precision. The data taken from interferometric measurement systems can be analysed by fringe counting or by image processing; the disadvantages of such methods are their cost, time and design challenges. The main innovation of our study is that the electronic circuit or image-processing algorithm otherwise needed to determine the values obtained from the measurement system can be eliminated by a data-processing step. In this step, the arrangement of the data is quite important for the achieved results. Data processing makes the analysis faster by increasing the quantity of usable data. The displacement values obtained by data processing show a 90% success rate when compared with both the electronic and the image-processing techniques.
A method based on a simultaneous fit of several Mössbauer spectra from a series of measurements is compared with one based on an independent analysis of each spectrum. Three different algorithms, namely a least-squares iteration procedure, a Gauss-Seidel function minimization, and a genetic-algorithm-based method, are applied and discussed. The conclusions drawn are supported by the analysis of simulated and measured spectra.
This article presents a new approach to analyzing the relationships between financial instruments. We use blind signal separation methods to decompose time series into their core components. The components common to the various instruments provide a broad set of characteristics to describe the internal morphology of the time series. In this research a modified and extended version of the AMUSE algorithm is used. The concept is demonstrated on real financial instruments.
In this work we empirically analyze the customer churn problem from a physical point of view to provide objective, data-driven and significant answers to support the decision-making process in business applications. In particular, we explore different entropy measures applied to decision trees and assess their performance from the business perspective using a set of model-quality measures often used in business practice. Additionally, the decision trees are compared with logistic regression and two machine-learning methods: neural networks and support vector machines.
A generalized algorithm for building classification trees, based on the Tsallis q-entropy, is proposed and applied to the classification of Polish households with respect to their incomes. Data for 2008 are used. Quality measures for the obtained trees are compared for different values of the parameter q. A method of choosing the optimum tree is elaborated.
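The Tsallis q-entropy itself, S_q(p) = (1 − Σ p_i^q)/(q − 1), which recovers the Shannon entropy in the limit q → 1, can be sketched as a split criterion; the tree-building machinery and the paper's quality measures are omitted.

```python
# Sketch of the Tsallis q-entropy used as a classification-tree split criterion.
import math


def tsallis_entropy(probs, q):
    """S_q = (1 - sum p_i^q) / (q - 1); q -> 1 gives the Shannon entropy."""
    if abs(q - 1.0) < 1e-12:                       # Shannon limit
        return -sum(p * math.log(p) for p in probs if p > 0)
    return (1.0 - sum(p ** q for p in probs)) / (q - 1.0)


probs = [0.5, 0.5]
s_shannon = tsallis_entropy(probs, 1.0)            # ln 2
s_q2 = tsallis_entropy(probs, 2.0)                 # 1 - (0.25 + 0.25) = 0.5
```

Varying q changes how strongly impure splits are penalized, which is exactly the degree of freedom the optimum-tree selection exploits.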
In this paper we present a novel similarity-measure method for financial data. In our approach, we propose assessing the similarity in a coherent, hierarchical and multi-faceted way, following a general scheme in which various detailed basic measures may be used, such as the Fermi-Dirac divergence, the Bose-Einstein divergence, or our new smoothness measure. The presented method is tested on benchmark and real stock-market data.
In this work, we study the structure of two-dimensional linear hybrid cellular automata under the adiabatic boundary condition. Further, we check the performance of hybrid cellular automata constructed from the members of this family in generating pseudo-random bits.
The article presents independent component analysis (ICA) applied to the concept of ensemble predictors. The use of ICA decomposition enables the extraction of components with particular statistical properties that can be interpreted as destructive or constructive for the prediction. Such a process can be treated as noise filtration of multivariate observation data, in which the observed data consist of prediction results. As a consequence of the ICA multivariate approach, the final results are a combination of the primary models, which can be interpreted as an aggregation step. The key issue of the presented method is the identification of the noise components. For this purpose, a new method for evaluating the randomness of the signals was developed. The experimental results show that the presented approach is effective for ensemble prediction with respect to different prediction criteria, even for a small set of models.
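The "flag noisy components, then recombine" step can be sketched as below. Note the heavy assumptions: the ICA decomposition itself is skipped (the components are given), and the authors' randomness measure is replaced by a simple lag-1 autocorrelation stand-in, kept only to show the filter-and-aggregate flow.

```python
# Hedged sketch: drop components that look random, sum the rest.
# lag-1 autocorrelation is a stand-in for the paper's randomness measure.

def lag1_autocorr(x):
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x)
    if var == 0:
        return 0.0
    return sum((x[i] - mean) * (x[i + 1] - mean) for i in range(n - 1)) / var


def denoise(components, threshold=0.2):
    """Sum only components whose |lag-1 autocorrelation| exceeds threshold."""
    kept = [c for c in components if abs(lag1_autocorr(c)) > threshold]
    n = len(components[0])
    return [sum(c[i] for c in kept) for i in range(n)]


trend = [float(i) for i in range(8)]                 # structured component
wiggle = [0.0, 1.0, 0.0, -1.0, 0.0, 1.0, 0.0, -1.0]  # zero lag-1 autocorrelation
out = denoise([trend, wiggle])                       # wiggle is filtered out
```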
A family of one-dimensional finite linear cellular automata with reflective boundary condition over the field Z_p is defined. The generalizations are the radius and the field from which the states take their values. Here, we establish a connection between the reversibility of the cellular automata and the rule matrix of the cellular automata with radius three. Also, we prove that the reverse CA of this family again falls into this family.
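The reversibility criterion for a linear CA reduces to asking whether its rule matrix is invertible over Z_p. The sketch below builds, for brevity, a radius-1 rule with null boundary over Z_2 rather than the paper's reflective-boundary, radius-3 family (only the matrix entries differ), and tests invertibility by Gaussian elimination mod p.

```python
# Sketch of "reversible iff the rule matrix is invertible over Z_p".

def rule_matrix(n):
    """Radius-1 linear rule next[i] = x[i-1] + x[i+1] (mod 2), null boundary."""
    m = [[0] * n for _ in range(n)]
    for i in range(n):
        if i > 0:
            m[i][i - 1] = 1
        if i < n - 1:
            m[i][i + 1] = 1
    return m


def invertible_mod_p(matrix, p):
    """Gauss-Jordan elimination over Z_p (p prime); True iff full rank."""
    m = [row[:] for row in matrix]
    n = len(m)
    for col in range(n):
        pivot = next((r for r in range(col, n) if m[r][col] % p), None)
        if pivot is None:
            return False
        m[col], m[pivot] = m[pivot], m[col]
        inv = pow(m[col][col] % p, p - 2, p)   # Fermat inverse, p prime
        for r in range(n):
            if r != col and m[r][col] % p:
                f = (m[r][col] * inv) % p
                m[r] = [(a - f * b) % p for a, b in zip(m[r], m[col])]
    return True
```

For this toy rule, the 2-cell automaton is reversible while the 3-cell one is not (its first and last matrix rows coincide over Z_2), illustrating how reversibility depends on the lattice size as well as the rule.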
In this paper, we study two-dimensional finite cellular automata defined by a hexagonal local rule with periodic boundary over the field Z_3. We construct the rule matrix corresponding to the hexagonal cellular automata. For some given coefficients and numbers of columns of the hexagonal information matrix, we prove that the hexagonal cellular automata are reversible.
This paper investigates the theoretical aspects of two-dimensional linear cellular automata with image applications. We consider the geometrical and visual aspects of patterns generated by cellular automata evolution. The present work focuses on the theory of two-dimensional linear cellular automata with respect to uniform periodic and adiabatic boundary conditions. Multiple copies of an arbitrary image produced by cellular automata find many applications in real life, e.g. textile design, DNA genetics research, etc.
Positron annihilation lifetime spectroscopy has proven to be a powerful tool for studying the nanostructure of porous materials. A positron emission tomograph is a device allowing the imaging of metabolic processes, e.g. in human bodies. A newly developed device, the Jagiellonian PET, will allow positron annihilation lifetime spectroscopy in addition to imaging, thus combining both analyses and providing new methods for physics and medicine. In this contribution we present a computer program that is compatible with the Jagiellonian PET software. We compare its performance with the standard program LT 9.0 using positron annihilation lifetime spectroscopy data from hexane measurements at different temperatures. Our program is based on an iterative procedure, and our fits prove that it performs as well as LT 9.0.
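The underlying model in lifetime spectroscopy is exponential decay, N(t) = N₀·exp(−t/τ). As a greatly simplified stand-in for the iterative multi-component fits performed by such programs, the sketch below extracts a single lifetime by linear least squares on log(counts); the resolution function, background and multiple components handled by the real software are ignored.

```python
# Simplified single-component lifetime fit: slope of log(counts) vs time.
import math


def fit_lifetime(times_ns, counts):
    """Least-squares slope of log(counts) vs t; returns tau = -1/slope."""
    xs = times_ns
    ys = [math.log(c) for c in counts]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -1.0 / slope


# Synthetic noiseless spectrum with tau = 3.2 ns (an ortho-positronium scale)
tau = 3.2
times = [0.5 * i for i in range(20)]
counts = [1000.0 * math.exp(-t / tau) for t in times]
```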
We investigate the main theoretical aspects of two-dimensional linear hybrid cellular automata with periodic boundary condition over the Galois field GF(2). We focus on the characterization of two-dimensional hybrid linear cellular automata by way of a special algorithm. Here we establish a relation between the reversibility of cellular automata and the characterization of two-dimensional hybrid linear cellular automata with a special boundary condition, i.e. the periodic case. The characterization problem for this special type of cellular automaton is studied by means of matrix algebra. It is believed that this type of cellular automata could find many different applications in specific areas, e.g. image processing, textile design, video processing, DNA research, etc., in the near future.
In solar-energy applications, several obstacles arise during the design of projects, and investors still struggle to reach the right and feasible project decisions. At this point, simulation techniques have emerged as indispensable. Simulation can thus be used not only during the design process of a plant but also, adaptively, for assessing the efficacy of existing plants, determining the operational condition of a solar system on a daily basis by probabilistic methods, owing to the random character of meteorological data. This study consists of two parts. The first part reviews the existing software tools and models that were available at the time this study was prepared; the reviewed tools and models are then briefly classified in figures. The second part outlines a new code, developed in the C# programming language, for solar calculations. The results obtained with the developed code were evaluated against suitable existing software tools and models to assess their correlation, and the results are presented in graphs to aid their utilization. Finally, the software systems most often used in scientific projects, including solar simulation packages and their modeling background, are surveyed, together with a comparison of separate and combined PV software programs by field of application, which should support further research studies.
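Solar calculations of the kind such a code performs typically start from the solar declination and the sunset hour angle; a sketch of these two standard formulas (Cooper's approximation for the declination, ω_s = arccos(−tan φ · tan δ) for the sunset hour angle) follows, transcribed in Python rather than the study's C#. Day numbers and latitudes are examples only.

```python
# Illustrative solar-geometry helpers (standard textbook formulas).
import math


def declination_deg(day_of_year):
    """Cooper's approximation: delta = 23.45 * sin(360/365 * (284 + n)) deg."""
    return 23.45 * math.sin(math.radians(360.0 / 365.0 * (284 + day_of_year)))


def sunset_hour_angle_deg(latitude_deg, day_of_year):
    """omega_s = arccos(-tan(phi) * tan(delta))."""
    phi = math.radians(latitude_deg)
    delta = math.radians(declination_deg(day_of_year))
    return math.degrees(math.acos(-math.tan(phi) * math.tan(delta)))


d_solstice = declination_deg(172)   # near the June solstice: about +23.45 deg
d_equinox = declination_deg(81)     # near the March equinox: about 0 deg
```

From the sunset hour angle, the day length in hours follows directly as 2ω_s/15, which is the usual next step in sizing calculations.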
A computer program was developed for studying transferred nuclear Overhauser effects in complex spin systems. It permits quantitative analysis of nuclear Overhauser effects observed in biologically important systems, such as ligands interacting with transmembrane receptors in the presence of lipid bilayers. The full generalized relaxation matrix approach takes into account the local mobility, spin equivalence, finite exchange rates, and spectral overlap. The program can be used either to simulate theoretical nuclear Overhauser effect buildup curves or to fit a relaxation matrix of a given model to experimental data. Selected examples illustrate the program's performance.
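The relaxation-matrix idea behind NOE buildup can be illustrated on the smallest case, a two-spin system obeying the Solomon equations dM/dt = −R(M − M_eq), integrated here by explicit Euler after inverting one spin. The rate values are arbitrary and the sketch omits everything that makes the real program useful: large spin systems, local mobility, exchange and spectral overlap.

```python
# Minimal two-spin Solomon-equation sketch of an NOE buildup curve.

def noe_buildup(rho=1.0, sigma=0.3, dt=0.001, t_max=2.0):
    """Return (times, I-deviation curve) after inverting spin S at t = 0."""
    i_dev, s_dev = 0.0, -2.0          # deviations from equilibrium (S inverted)
    times, curve = [], []
    t = 0.0
    while t <= t_max:
        times.append(t)
        curve.append(i_dev)
        di = (-rho * i_dev - sigma * s_dev) * dt   # cross-relaxation drives I
        ds = (-rho * s_dev - sigma * i_dev) * dt
        i_dev, s_dev = i_dev + di, s_dev + ds
        t += dt
    return times, curve


times, curve = noe_buildup()
```

The curve shows the characteristic transient-NOE shape: the I-spin deviation starts at zero, builds up through cross-relaxation, and then relaxes back toward equilibrium.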