:: Search published articles ::
Showing 3 results for Dehghan

Razieh Dehghanian, Rahim Chinipardaz, Behzad Mansouri,
Volume 18, Issue 2 (3-2014)
Abstract

Classical discrimination methods, such as linear and quadratic discriminant analysis, are not efficient for non-Gaussian or nonlinear time series data. Nonparametric kernel discrimination, in which kernel estimators of the likelihood functions are used in place of their true values, has been shown to perform well. Because of its flexibility, the misclassification rate of kernel discrimination is usually lower than that of the linear and quadratic methods. However, the kernel estimates depend on the bandwidth. This paper is concerned with selecting the bandwidth parameter so as to achieve optimal discrimination with a minimum misclassification rate. The resulting bandwidth-selection methods are examined via a simulation study.
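As an illustration (not the authors' exact procedure), the idea can be sketched in Python: each class density is estimated by a Gaussian kernel density estimate, observations are assigned to the class with the highest estimated likelihood, and the bandwidth is chosen to minimize a hold-out misclassification rate. All function names, the data, and the grid of candidate bandwidths are illustrative assumptions.

```python
import numpy as np

def kde_log_likelihood(x, sample, h):
    """Gaussian-kernel log-density estimate at points x from a 1-D training sample."""
    z = (x - sample[:, None]) / h                     # shape (n_train, n_eval)
    dens = np.exp(-0.5 * z**2).sum(axis=0) / (len(sample) * h * np.sqrt(2 * np.pi))
    return np.log(dens + 1e-300)                      # guard against log(0)

def kernel_classify(x, samples, h):
    """Assign each point in x to the class whose kernel density estimate is highest."""
    scores = np.vstack([kde_log_likelihood(x, s, h) for s in samples])
    return scores.argmax(axis=0)

def select_bandwidth(samples, hs, rng):
    """Pick the bandwidth in hs with the lowest hold-out misclassification rate."""
    best_h, best_err = None, np.inf
    for h in hs:
        errs = []
        for k, s in enumerate(samples):
            idx = rng.permutation(len(s))
            tr, te = s[idx[len(s) // 2:]], s[idx[:len(s) // 2]]
            # train on half of class k (plus full other classes), test on the other half
            train = [t if j != k else tr for j, t in enumerate(samples)]
            errs.append(np.mean(kernel_classify(te, train, h) != k))
        err = np.mean(errs)
        if err < best_err:
            best_h, best_err = h, err
    return best_h, best_err

rng = np.random.default_rng(0)
g0 = rng.normal(0.0, 1.0, 200)    # class 0: illustrative Gaussian sample
g1 = rng.normal(2.0, 1.5, 200)    # class 1: shifted, wider sample
h, err = select_bandwidth([g0, g1], hs=np.linspace(0.1, 1.5, 15), rng=rng)
print(h, err)
```

The hold-out criterion here is one simple stand-in for the misclassification-based bandwidth selectors the abstract refers to; cross-validated variants follow the same pattern.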

Sakineh Dehghan, Mohammadreza Farid-Rohani,
Volume 20, Issue 1 (4-2015)
Abstract

In this article, we first introduce the depth function as a tool for center-outward ranking. We then present and use the half-space (Tukey) depth function, one of the most popular depth functions. Next, multivariate nonparametric tests for location and scale differences between two populations are expressed through rankings and statistics based on the depth-versus-depth (DD) plot. Finally, using these tests, the performance of the suggested non-invasive distraction method is evaluated with respect to pain intensity, quality of life, functional ability, and inflammation rate in osteoarthritis patients, and is compared with the usual invasive distraction method.
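A minimal sketch of the half-space (Tukey) depth for bivariate data, approximated by minimizing over random projection directions; the function names and the direction-sampling scheme are illustrative assumptions, not the article's implementation. A deep point (near the center) gets depth close to 1/2, an outlying point close to 0.

```python
import numpy as np

def tukey_depth(point, data, n_dirs=720, seed=0):
    """Approximate half-space (Tukey) depth of `point` in a 2-D sample:
    the minimum, over directions u, of the fraction of data points lying
    in the closed half-plane {x : (x - point) . u >= 0}."""
    rng = np.random.default_rng(seed)
    angles = rng.uniform(0.0, np.pi, n_dirs)
    dirs = np.column_stack([np.cos(angles), np.sin(angles)])  # (n_dirs, 2)
    proj = (data - point) @ dirs.T                            # (n, n_dirs)
    frac_pos = (proj >= 0).mean(axis=0)   # half-plane on the +u side
    frac_neg = (proj <= 0).mean(axis=0)   # half-plane on the -u side
    return float(min(frac_pos.min(), frac_neg.min()))

rng = np.random.default_rng(1)
data = rng.normal(size=(500, 2))                  # illustrative bivariate sample
d_center = tukey_depth(np.array([0.0, 0.0]), data)
d_outlier = tukey_depth(np.array([5.0, 5.0]), data)
print(d_center, d_outlier)
```

Ranking all observations by such depth values is exactly the center-outward ranking on which the DD-plot-based tests are built.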


Manije Sanei Tabas, Mohammadhosein Dehghan, Fatemeh Ashtab,
Volume 28, Issue 1 (9-2023)
Abstract

Variance and entropy are distinct metrics commonly used to measure the uncertainty of a random variable. Variance measures how much a random variable spreads around its expectation, whereas entropy measures uncertainty from an information-theoretic viewpoint; in other words, it measures the average amount of information carried by a random variable.
For both the uniform and normal distributions, the variance is a constant multiple of the entropy power. Finding such a monotonic relationship between variance and entropy for a larger class of distributions is important and useful in signal processing, machine learning, information theory, probability, and statistics; for example, it is used to reduce estimation error and to choose the strategy that, on average, yields the greatest (or nearly the greatest) reduction in the entropy of the distribution of the target location, an approach whose effectiveness has been tested through simulations with mining assay models. In this article, an upper bound on the variance is established, with the help of the entropy power, for unimodal distributions whose tails are heavier than exponential tails.
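The uniform/normal relationship mentioned above can be checked numerically. The sketch below assumes the standard definition of entropy power, N(X) = exp(2 h(X)) / (2πe), where h is the differential entropy: for a normal distribution N(X) equals the variance exactly, while for a uniform distribution the ratio Var(X)/N(X) is the constant πe/6, independent of the interval length.

```python
import numpy as np

def entropy_power(h):
    """Entropy power N(X) = exp(2 h(X)) / (2*pi*e), with h the differential entropy."""
    return np.exp(2.0 * h) / (2.0 * np.pi * np.e)

# Normal(0, sigma^2): h = 0.5 * ln(2*pi*e*sigma^2), so N(X) = sigma^2 exactly.
sigma2 = 3.0
h_norm = 0.5 * np.log(2.0 * np.pi * np.e * sigma2)
print(entropy_power(h_norm))        # -> 3.0 (up to floating point)

# Uniform(0, a): h = ln(a) and Var = a^2/12, so Var / N(X) = pi*e/6 for every a.
a = 5.0
h_unif = np.log(a)
ratio = (a**2 / 12.0) / entropy_power(h_unif)
print(ratio, np.pi * np.e / 6.0)    # both ~1.4233, constant in a
```

The article's bound extends this kind of variance-entropy-power comparison beyond these two exactly solvable families.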



Andishe-ye Amari (Journal of Statistical Thinking)