
About the Teacher. Alexander D. Soloviev

 

Viktor Kashtanov
Moscow Institute of Applied Mathematics,
National Research University "Higher School of Economics"
(Moscow, Russia)
VAKashtan@yandex.ru
 

 

The article is about a remarkable man and an outstanding scientist – Alexander Dmitrievich Soloviev: Doctor of Physics and Mathematics, Professor, Laureate of the USSR State Prize, and Professor of the Probability Theory Department of the Faculty of Mechanics and Mathematics at Lomonosov Moscow State University. Alexander Dmitrievich lived an amazing creative life that can serve as an example for modern researchers. Viktor Kashtanov, his student, shares his recollections and reflections on some episodes in the life of his teacher and friend.

 

Keywords: memories, Soloviev, mathematicians, reliability assessment, redundancy

 

 

Cite: Kashtanov, V. About the Teacher. Alexander D. Soloviev. Reliability: Theory & Applications. 2022, December, 4(71): 22-34. https://doi.org/10.24412/1932-2321-2022-471-22-34

 


 


 

Failure Criteria and Time over Thresholds in Them

 

Victor Netes

Moscow Technical University of Communications and Informatics, Russia
v.a.netes@mtuci.ru

 

A failure is one of the key concepts in dependability. Therefore, it is very important to distinguish whether a failure has occurred or not. To do this, a failure criterion is formulated. This article describes the main approaches to determining failure criteria. Special attention is paid to the parametric approach, in which a failure is an event in which one of the parameters characterizing the functioning of an item goes beyond the specified limits. In addition, a time over threshold can also be set. This means that short-term disruptions in an item's operation are not considered failures. The meaning of setting such a threshold is explained, and examples of its use in telecommunications are given. For a parallel system with a time over threshold in its failure criterion, calculation formulas for dependability measures are derived. The errors introduced by using traditional formulas in this situation are estimated.
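The time-over-threshold idea can be illustrated with a small Monte Carlo sketch (not the paper's closed-form formulas): outages of a two-unit parallel repairable system count as failures only when they last longer than a set threshold. The rates, horizon, and threshold below are illustrative assumptions.

```python
import random

def parallel_downtimes(horizon, fail_rate, repair_rate, n_units=2, seed=1):
    """Simulate an n-unit parallel repairable system as a CTMC and collect
    the lengths of intervals during which ALL units are down (system outage)."""
    rng = random.Random(seed)
    t, up = 0.0, n_units
    down_start, downtimes = 0.0, []
    while t < horizon:
        total = up * fail_rate + (n_units - up) * repair_rate
        t += rng.expovariate(total)
        if t >= horizon:
            break
        if rng.random() < up * fail_rate / total:
            up -= 1
            if up == 0:
                down_start = t                        # system outage begins
        else:
            if up == 0:
                downtimes.append(t - down_start)      # outage ends at first repair
            up += 1
    return downtimes

def availability(downtimes, horizon, threshold=0.0):
    """Time-over-threshold criterion: only outages longer than `threshold`
    count as failures; shorter disruptions are ignored."""
    return 1.0 - sum(d for d in downtimes if d > threshold) / horizon

outs = parallel_downtimes(horizon=10000.0, fail_rate=0.1, repair_rate=1.0)
a_plain = availability(outs, 10000.0)                 # every outage counts
a_tot = availability(outs, 10000.0, threshold=1.0)    # short outages ignored
```

With a threshold, the measured availability can only increase, since some outages no longer count as failures — which is precisely why the traditional (zero-threshold) formulas are biased in this setting.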

 

Keywords: failure criterion, parametric approach, time over threshold, parallel system, MTBF, MTTR, availability

 

 

Cite: Netes, V. Failure Criteria and Time over Thresholds in Them. Reliability: Theory & Applications. 2022, December, 4(71): 35-42. https://doi.org/10.24412/1932-2321-2022-471-35-42

 



 

Analysis of Risks in the Modelling of Material Consumption Trends in the Production Process

 

Alena Breznická, Ľudmila Timárová, Beáta Kopiláková 

Faculty of Special Technology, Alexander Dubček

University of Trenčín, Ku kyselke 469, 911 06, Trenčín, Slovakia
alena.breznicka@tnuni.sk, ludmila.timarova@tnuni.sk,

beata.kopilakova@tnuni.sk

 

Quantitative risk analysis approaches in today's technologically advanced age represent a suitable process for mathematical investigation, revealing the context of the origin and existence of risks and their possible effects on ensuring reliability. Today, manufacturing and industrial companies, under the growing pressure of globalization, must deal with vast amounts of data that evaluate various processes in maintenance management, warehouse and inventory management, or quality evaluation. One way to ensure objective collection, analysis and evaluation of robust data is to use bootstrapping principles and modules. Many companies use these tools, which are now becoming available to a wider range of users. Bootstrap principles enable the calculation of robust estimates, e.g., standard errors and confidence intervals; the bootstrap method is therefore suitable for estimating statistics such as the mean, median, correlation coefficient or regression coefficients. In this article, we take a closer look at what bootstrapping is, show how to set up the calculation of bootstrap estimates, and describe what types of output are then displayed. Logistic forecasting of spare parts with sporadic consumption is difficult because demand data are hard to obtain and the demand itself is intermittent, usually characterized by long periods of zero demand. The presented contribution describes the possibilities of using this method as the starting point for a stochastic forecast of future consumption. Based on this method, we can determine the minimum order stock level. The results of the simulations are also presented in graphical outputs.
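A minimal sketch of the percentile-bootstrap idea the abstract describes; the sporadic-consumption data and the choice of statistic are illustrative assumptions, not the authors' dataset.

```python
import random
import statistics

def bootstrap_ci(data, stat=statistics.mean, n_boot=2000, alpha=0.05, seed=42):
    """Percentile bootstrap: resample with replacement, recompute the statistic,
    and read the confidence interval off the quantiles of the replicates."""
    rng = random.Random(seed)
    reps = sorted(
        stat([rng.choice(data) for _ in range(len(data))])
        for _ in range(n_boot)
    )
    return reps[int(n_boot * alpha / 2)], reps[int(n_boot * (1 - alpha / 2)) - 1]

# Hypothetical sporadic spare-part consumption (long runs of zero demand):
consumption = [0, 0, 3, 0, 1, 0, 0, 5, 0, 2, 0, 0, 4, 0, 1]
lo, hi = bootstrap_ci(consumption)
```

The same function works unchanged for the median, a correlation coefficient, or a regression coefficient — only the `stat` argument changes, which is why the bootstrap suits the heterogeneous estimates the abstract lists.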

 

Keywords: bootstrap, simulation, inventory management

 

 

Cite: Breznická, A., Timárová, Ľ., Kopiláková, B. Analysis of Risks in the Modelling of Material Consumption Trends in the Production Process. Reliability: Theory & Applications. 2022, December, 4(71): 43-49. https://doi.org/10.24412/1932-2321-2022-471-43-49

 



 

Methodical rationale of system solutions to reduce risks and retain them within acceptable limits for knowledge management process

 

Andrey Kostogryzov

Federal Research Center “Computer Science and Control”

of the Russian Academy of Sciences, Moscow, Russia,

e-mail: Akostogr@gmail.com

 

Roman Avdonin
Federal Research Center “Computer Science and Control”

of the Russian Academy of Sciences, Moscow, Russia,

e-mail: ft.99@yandex.ru
 

Andrey Nistratov

Federal Research Center “Computer Science and Control”

of the Russian Academy of Sciences, Moscow, Russia,

e-mail: andrey.nistratov@gmail.com  

 

An approach to the formalization of the standard knowledge management process is proposed, taking into account the requirements for information protection. The approach has been developed into a methodical approach for estimating and justifying system solutions that reduce risks and/or retain risks within acceptable limits under various threat scenarios. The approach makes it possible to estimate, by probabilistic measures, the impact of various threats on knowledge management process performance (including threats of violating information protection requirements). The usability of the proposed methodical approach is demonstrated by examples.

 

Keywords: analysis, engineering, information protection, knowledge, model, prediction, risk, system

 

 

Cite: Kostogryzov, A., Avdonin, R., Nistratov, A. Methodical rationale of system solutions to reduce risks and retain them within acceptable limits for knowledge management process. Reliability: Theory & Applications. 2022, December, 4(71): 50-64. https://doi.org/10.24412/1932-2321-2022-471-50-64

 



 

Improving Dijkstra’s algorithm for Estimating Project Characteristics and Critical Path

 

Adilakshmi Siripurapu

Dept. of Basic Science and Humanities, Vignan’s Institute of Information
Technology (A), Duvvada, Visakhapatnam, AP, India, laxmimaths2008@gmail.com

 

Ravi Shankar Nowpada

Dept. of Mathematics, Institute of Science, GITAM (Deemed to be University),
Visakhapatnam, AP, India, Drravi68@gmail.com

 

K. Srinivasa Rao 

Dept. of Operations, GITAM school of Business, GITAM (Deemed to be University),
Visakhapatnam, AP, India, skolli2@gitam.edu

 

Developing a project planning structure for any industry is a technological challenge involving the evaluation of several restrictions on each activity's respective task and its planning tools. Any restriction affects the completion time, operating costs, and overall project performance. The Programme Evaluation and Review Technique (PERT) and the Critical Path Method (CPM) led many researchers to study possible ways of finding the critical paths and activities in a network. The advancement of CPM and PERT towards a probabilistic environment is still a long way off. However, artificial intelligence approaches such as the Genetic Algorithm, Dijkstra's algorithm, and others are utilized for network analysis within the project management framework. This study helps the project manager plan the schedule of a construction project and determine the expected completion time. In this research paper, we describe a method for obtaining the earliest and latest times of a critical path using a modified Dijkstra's algorithm with triangular fuzzy numbers. Forward pass and backward pass algorithms are designed to find the optimal path for the proposed method. Numerical examples are illustrated. Simulation results are obtained using a C program. Finally, a comparison is made with the traditional PERT method.
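The forward-pass/backward-pass logic mentioned above can be sketched for the crisp (non-fuzzy) case; the activity network below is a made-up example, and the paper's fuzzy extension would replace the crisp durations with triangular fuzzy numbers.

```python
def critical_path(activities):
    """activities: {name: (duration, [predecessors])}, assumed to form a DAG.
    Classic CPM passes: returns (project length, critical activities)."""
    # Forward pass: earliest start/finish, processed once all predecessors are done.
    es, ef, remaining = {}, {}, dict(activities)
    while remaining:
        for name, (dur, preds) in list(remaining.items()):
            if all(p in ef for p in preds):
                es[name] = max((ef[p] for p in preds), default=0.0)
                ef[name] = es[name] + dur
                del remaining[name]
    project = max(ef.values())
    # Backward pass: latest finish/start, processed once all successors are done.
    succs = {n: [m for m, (_, ps) in activities.items() if n in ps] for n in activities}
    lf, ls, remaining = {}, {}, dict(activities)
    while remaining:
        for name, (dur, _) in list(remaining.items()):
            if all(s in ls for s in succs[name]):
                lf[name] = min((ls[s] for s in succs[name]), default=project)
                ls[name] = lf[name] - dur
                del remaining[name]
    # Zero total slack (ls == es) identifies the critical activities.
    critical = [n for n in activities if abs(ls[n] - es[n]) < 1e-9]
    return project, critical

net = {'A': (3, []), 'B': (2, []), 'C': (4, ['A']),
       'D': (1, ['A', 'B']), 'E': (2, ['C', 'D'])}
length, critical = critical_path(net)
```

Here the critical chain is A → C → E with project length 9; B and D carry slack.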

 

Keywords: Critical Path, Dijkstra’s Algorithm, Earliest and latest times, modified Dijkstra’s algorithm, PERT.

 

 

Cite: Adilakshmi Siripurapu, Ravi Shankar Nowpada, K. Srinivasa Rao. Improving Dijkstra’s algorithm for Estimating Project Characteristics and Critical Path. Reliability: Theory & Applications. 2022, December, 4(71): 65-73. https://doi.org/10.24412/1932-2321-2022-471-65-73

 



 

MLE OF A 3-PARAMETER GAMMA DISTRIBUTION ANALYSIS OF RAINFALL INTENSITY DATA SETS

 

David, I.J., Adubisi, D.O., Ogbaji, O.E., Ikwuoche, O.P.  

Department of Mathematics and Statistics,

Federal University Wukari, Nigeria
davidij@fuwukari.edu.ng,

adubisiobinna@fuwukari.edu.ng,

ogbajieka@yahoo.com,

philonye10@gmail.com  

 

Adehi, U.M.

Department of Statistics, Nasarawa State

University, Keffi, Nigeria
maryadehi@yahoo.com

 

This research presents the maximum likelihood estimation of a three-parameter Gamma distribution with application to four types of average rainfall intensities in Nigeria. These data sets are average half-yearly, yearly, quarterly and monthly rainfall intensities. The fitted three-parameter Gamma is compared to a two-parameter Gamma distribution using empirical distribution function (EDF) tests. The tests used are the Cramér-von Mises, Anderson-Darling and Kolmogorov-Smirnov statistics. Based on the results obtained at the 10% significance level, both the two-parameter and three-parameter Gamma distributions fit well only the average yearly rainfall intensity data. A kernel density plot revealed that the average half-yearly, quarterly and monthly rainfall intensity data sets are multi-modal in nature, which explains both Gamma distributions' poor fit to these data sets. Also, the PDF, CDF and Q-Q plots are presented, which support the outcome of the analysis.
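The EDF-test machinery (here the Kolmogorov-Smirnov statistic against a fitted distribution) can be sketched in a few lines. To stay self-contained, this illustration fits the exponential distribution — the shape-1 special case of the Gamma, whose CDF is closed-form — to made-up rainfall-like data; it is not the paper's Nigerian data or its 3-parameter Gamma fit.

```python
import math

def ks_statistic(data, cdf):
    """D_n = sup_x |F_n(x) - F(x)|, evaluated at the jumps of the EDF."""
    xs = sorted(data)
    n, d = len(xs), 0.0
    for i, x in enumerate(xs, start=1):
        f = cdf(x)
        d = max(d, abs(i / n - f), abs((i - 1) / n - f))
    return d

# Hypothetical rainfall intensities; exponential MLE of the rate is n / sum(x):
rain = [1.2, 0.4, 2.7, 0.9, 3.1, 0.5, 1.8, 2.2, 0.7, 1.5]
rate = len(rain) / sum(rain)
D = ks_statistic(rain, lambda x: 1 - math.exp(-rate * x))
```

The Cramér-von Mises and Anderson-Darling statistics used in the paper are computed from the same sorted EDF-versus-CDF comparison, only with different weightings of the discrepancies.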

 

Keywords: Gamma distribution, Anderson-Darling, Cramér-von Mises, Kernel density, Kolmogorov-Smirnov, Maximum likelihood estimation

 

 

Cite: David, I.J., Adubisi, D.O., Ogbaji, O.E., Adehi, U.M., Ikwuoche, O.P. MLE OF A 3-PARAMETER GAMMA DISTRIBUTION ANALYSIS OF RAINFALL INTENSITY DATA SETS. Reliability: Theory & Applications. 2022, December, 4(71): 74-86. https://doi.org/10.24412/1932-2321-2022-471-74-86

 



 

Designing of Inventory Management for Determining the Optimal Number of Objects at the Inventory Grouping Based on ABC Analysis

 

K. Srinivasa Rao

Dept. of Operations, GITAM school of Business,

GITAM (Deemed to be University),
Visakhapatnam, AP, India,

skolli2@gitam.edu

 

R. Venu Gopal

Dept. of Marketing, GITAM school of Business,

GITAM (Deemed to be University),
Visakhapatnam, AP
, India,

vreddi@gitam.edu

 

Adilakshmi Siripurapu
Dept. of Basic Science and Humanities,

Vignan’s Institute of Information Technology (A),
Duvvada, Visakhapatnam, AP, India.

slakshmijagarapu@gmail.com

 

Among the most widely used procedures in inventory organization is inventory grouping based on ABC analysis, a well-known technique for placing items into different classes according to their status and value. This research proposes a bi-objective mathematical model to improve inventory grouping based on ABC analysis. The proposed model simultaneously improves the service level, the number of inventory groups, and the number of items assigned to each group. A mathematical model is presented in this study to categorize inventory items, taking into account significant revenue and cost-reduction criteria. The model aims to maximize the net gain of the available items; economic and inventory constraints are also taken into account. The Benders decomposition and Lagrangian relaxation procedures are applied to solve the resulting mathematical model, and the outcomes of the two solutions are then compared. TOPSIS and numerical examinations evaluate the proposed solutions and select the best one. Several sensitivity studies on the model were then completed, which assist inventory control executives in assessing the effect of inventory administration costs on optimal decision making and item grouping. The model was run for ten different numerical instances, and the results of the two suggested approaches were statistically compared using a t-test. As a result, the TOPSIS technique was appropriate, and the Lagrangian approach was chosen as the better technique.
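The ABC grouping that the model builds on can be sketched directly; the 80%/95% cumulative-value cut-offs are the usual convention, and the item values below are invented.

```python
def abc_classify(items, a_cut=0.80, b_cut=0.95):
    """items: {sku: annual_value}.  Rank items by value; class A covers roughly
    the top 80% of cumulative value, B the next 15%, C the rest."""
    total = sum(items.values())
    classes, cum = {}, 0.0
    for sku, value in sorted(items.items(), key=lambda kv: -kv[1]):
        cum += value
        share = cum / total
        classes[sku] = 'A' if share <= a_cut else ('B' if share <= b_cut else 'C')
    return classes

# Hypothetical annual consumption values per item:
stock = {'P1': 8000, 'P2': 4000, 'P3': 1500, 'P4': 800, 'P5': 450, 'P6': 250}
classes = abc_classify(stock)
```

The paper's optimization then treats the number of groups and the group boundaries themselves as decision variables rather than fixing them at these conventional cut-offs.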

 

Keywords: ABC analysis, Bi-Goal optimization, inventory control, decomposition procedures, TOPSIS

 

 

Cite: K. Srinivasa Rao, R. Venu Gopal, Adilakshmi Siripurapu. Designing of Inventory Management for Determining the Optimal Number of Objects at the Inventory Grouping Based on ABC Analysis. Reliability: Theory & Applications. 2022, December, 4(71): 87-97. https://doi.org/10.24412/1932-2321-2022-471-87-97

 



 

BAYESIAN INTERVAL ESTIMATION FOR THE PARAMETERS OF POISSON TYPE RAYLEIGH CLASS MODEL

 

Rajesh Singh

Department of Statistics, S. G. B.

Amravati University, Amravati, India
rsinghamt@hotmail.com
 

Preeti A. Badge
Department of Statistics, S. G. B.

Amravati University, Amravati, India
preetibadge10@gmail.com
 

Pritee Singh 
Department of Statistics, Institute

of Science, Nagpur, India.
priteesingh25@gmail.com
 

 

In this article, a two-sided Bayesian interval is proposed for the parameters of the Poisson-type Rayleigh class software reliability growth model. The failure intensity function, mean time to failure function and likelihood function of this model have been derived by considering the parameters total number of failures and scale parameter. The mathematical expressions of the Bayesian interval for the parameters have been obtained by considering non-informative priors. The performance of the proposed Bayesian interval is studied on the basis of average length and coverage probability, obtained by using the Monte Carlo simulation technique after generating 1000 random samples. From the obtained results, it is concluded that the Bayesian intervals of the parameters perform better for an appropriate choice of execution time and certain values of the parameters.
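The average-length/coverage-probability methodology is generic and easy to sketch by Monte Carlo. The example below evaluates a simple large-sample interval for a Poisson mean — an illustration of the evaluation procedure only, not the paper's Rayleigh-class Bayesian interval.

```python
import math
import random

def poisson_draw(lam, rng):
    """Knuth's product-of-uniforms method for Poisson variates."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def coverage_and_length(theta=4.0, n=30, reps=1000, z=1.96, seed=7):
    """Monte Carlo estimate of the coverage probability and average length of
    the approximate 95% interval  xbar +/- z*sqrt(xbar/n)  for a Poisson mean."""
    rng = random.Random(seed)
    hits, total_len = 0, 0.0
    for _ in range(reps):
        xbar = sum(poisson_draw(theta, rng) for _ in range(n)) / n
        half = z * math.sqrt(xbar / n)
        hits += xbar - half <= theta <= xbar + half
        total_len += 2 * half
    return hits / reps, total_len / reps

cov, avg_len = coverage_and_length()
```

An interval is judged good when its estimated coverage is close to the nominal 95% while its average length stays small — the same two criteria the paper applies to its Bayesian intervals over 1000 generated samples.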

 

Keywords: Rayleigh distribution, Non informative prior, Software reliability growth model, Bayesian interval, average length, coverage probability.

 

 

Cite: Rajesh Singh, Preeti A. Badge, Pritee Singh. BAYESIAN INTERVAL ESTIMATION FOR THE PARAMETERS OF POISSON TYPE RAYLEIGH CLASS MODEL. Reliability: Theory & Applications. 2022, December, 4(71): 98-108. https://doi.org/10.24412/1932-2321-2022-471-98-108

 



 

AN INFERENTIAL STUDY OF DISCRETE BURR-HATKE EXPONENTIAL DISTRIBUTION UNDER COMPLETE AND CENSORED DATA

 

Arvind Pandey, Ravindra Pratap Singh, Abhishek Tyagi  

Department of Statistics, Central University

of Rajasthan, Rajasthan-305817, India
Department of Statistics, Chaudhary

Charan Singh University, Meerut-250004, India
arvindmzu@gmail.com,

stats.rpsingh@gmail.com,

abhishektyagi033@gmail.com

 

In this article, a new one-parameter discrete distribution called the discrete Burr-Hatke exponential distribution is introduced and its mathematical characteristics are thoroughly investigated. The proposed distribution is capable of modelling over-dispersed, positively skewed, decreasing-failure-rate, and randomly right-censored data. We have also derived many statistical properties, including moments, skewness, kurtosis, mean residual life and mean past lifetime, index of dispersion, coefficient of variation, stress-strength parameter, quantile function, and order statistics. The method of maximum likelihood is used to estimate the model's unknown parameter under complete and censored data. In addition, a technique for generating randomly right-censored data from the proposed model is provided. To evaluate the behaviour of the estimator with complete and censored data, two simulation studies are presented. Two complete and two censored datasets from various disciplines are studied to demonstrate the significance of the suggested distribution in comparison to existing discrete probability distributions.

 

Keywords: Burr-Hatke exponential distribution, Method of maximum likelihood, Discrete distribution, Random censoring, Simulation study

 

 

Cite: Arvind Pandey, Ravindra Pratap Singh, Abhishek Tyagi. AN INFERENTIAL STUDY OF DISCRETE BURR-HATKE EXPONENTIAL DISTRIBUTION UNDER COMPLETE AND CENSORED DATA. Reliability: Theory & Applications. 2022, December, 4(71): 109-122. https://doi.org/10.24412/1932-2321-2022-471-109-122

 



 

CONSTRUCTION AND SELECTION OF SKIP LOT SAMPLING PLAN OF TYPE SKSP-V FOR LIFE TESTS BASED ON PERCENTILES OF EXPONENTIATED RAYLEIGH DISTRIBUTION

 

P. Umamaheswari
Assistant Professor,
Sona College of Arts & Science
uma_m2485@yahoo.com


K. Pradeepa Veerakumari
Assistant Professor, Department

of Statistics, Bharathiar University,

Coimbatore, Tamil Nadu, India
pradeepaveerakumari@buc.ac.in
 

S. Suganya
Assistant Professor, Department

of Statistics, PSG College of Arts & Science,

Coimbatore, Tamil Nadu, India
suganstat@gmail.com  

 

This study uses percentiles under the exponentiated Rayleigh distribution to build a skip lot sampling plan of the SkSP-V type for a life test. A truncated life test may be carried out to determine the minimum sample size that guarantees a specific percentile lifetime of products. In particular, this paper highlights the construction of the Skip lot Sampling Plan of the type SkSP-V with the Single Sampling Plan as the reference plan, for life tests based on percentiles of the Exponentiated Rayleigh Distribution. Calculations are made for various quality levels to determine the minimum sample size, prescribed ratio, and operating characteristic values. The proposed sampling plan, which is appropriate for the manufacturing industries for the selection of samples, is also analyzed in terms of its parameters and metrics. The operating characteristic curve is produced after tabulating the corresponding values of the plan. Illustrations are provided to help the reader comprehend the plan. In addition, the feasibility of the new strategy is addressed.
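The operating characteristic values tabulated for such plans come from the acceptance probability of the reference single sampling plan, which is a binomial tail sum. Here is that computation for an assumed plan (n = 80, c = 2 are illustrative values, not the paper's tables).

```python
from math import comb

def acceptance_prob(n, c, p):
    """OC value of a single sampling plan (n, c): the probability of accepting
    a lot, P(X <= c) with X ~ Binomial(n, p) defectives in the sample."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

# The OC curve: acceptance probability falls as incoming quality worsens.
curve = [(p, acceptance_prob(80, 2, p)) for p in (0.01, 0.02, 0.05, 0.10)]
```

A skip-lot plan such as SkSP-V then inspects only a fraction of lots once the process has demonstrated good quality, which raises the overall acceptance probability at good quality levels relative to this reference plan.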

 

Keywords: Exponentiated Rayleigh Distribution, Percentiles, Life tests, Single Sampling Plan, Double Sampling Plan, SkSP –V.

 

 

Cite: P. Umamaheswari, K. Pradeepa Veerakumari, S. Suganya. CONSTRUCTION AND SELECTION OF SKIP LOT SAMPLING PLAN OF TYPE SKSP-V FOR LIFE TESTS BASED ON PERCENTILES OF EXPONENTIATED RAYLEIGH DISTRIBUTION. Reliability: Theory & Applications. 2022, December, 4(71): 123-131. https://doi.org/10.24412/1932-2321-2022-471-123-131

 



 

STOCHASTIC ANALYSIS OF A COLD STANDBY COMPUTER SYSTEM WITH UP-GRADATION PRIORITY AND FAILURE OF SERVICE FACILITY

 

R. K. Yadav, N. Nandal, S.C. Malik 

Department of Statistics, M.D. University, Rohtak
yadav.ramesh546@gmail.com,
nsinghnandal@gmail.com,
sc_malik@rediffmail.com

 

We describe the development of a stochastic model for a computer system with cold standby redundancy, priority, and failure of the service facility. A computer system (called a single unit) means the simultaneous working of its hardware and software components. The system has one more unit (also a computer system) that can be used as and when required upon the failure of any of the hardware/software components of the initially operative computer system. A single repair facility is made available to rectify the faults which occur due to the failure of hardware and software components. The failed hardware component undergoes repair immediately, while failed software is upgraded. The service facility is subject to failure during hardware repair, and provision for perfect treatment has been made for the failed service facility. The components work as new after repair and up-gradation, with the same lifetime distribution. Priority is given to software up-gradation over hardware repair. In steady state, the expressions for some important reliability measures have been derived using the well-known semi-Markov process and regenerative point technique. The behavior of some useful reliability characteristics has been observed for particular values of the parameters related to failure times, repair and up-gradation times, and treatment time, which follow the negative exponential distribution.

 

Keywords: Computer System, Unit Wise Redundancy, Priority, Failure of Service Facility and Stochastic Modelling

 

 

Cite: R. K. Yadav, N. Nandal, S.C. Malik. STOCHASTIC ANALYSIS OF A COLD STANDBY COMPUTER SYSTEM WITH UP-GRADATION PRIORITY AND FAILURE OF SERVICE FACILITY. Reliability: Theory & Applications. 2022, December, 4(71): 132-142. https://doi.org/10.24412/1932-2321-2022-471-132-142

 



 

M/M/∞ Queue with Catastrophes and Repairable Servers

 

Gulab Singh Bura  

Department of Mathematics and Statistics,

Banasthali Vidyapith, Rajasthan, INDIA
gulabsingh@banasthali.in

 

An infinite-server Markovian queueing system with randomly occurring breakdowns and non-zero exponentially distributed repair time is proposed. Upon arrival of a catastrophe, all the servers are deactivated and the system is under catastrophic failure. Immediately, a repair process is started, and after successful repair the system is ready to serve newly arriving customers. Continued fraction techniques have been used to obtain the time-dependent probabilities of the studied model. The stationary probability distribution for the number of customers in the system is also derived. Some important stationary as well as transient moments are determined. Further, the availability and reliability of the system under consideration are investigated. Finally, some graphical results are presented to visualize the model practically.
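The model's behaviour can be cross-checked against a small event-driven simulation. This is a sketch with assumed rates and with arrivals during repair assumed lost — the paper itself works analytically, via continued fractions and confluent hypergeometric functions.

```python
import random

def mm_inf_catastrophes(lam=5.0, mu=1.0, gamma=0.1, eta=2.0,
                        horizon=5000.0, seed=3):
    """Event-driven simulation of an M/M/inf queue with catastrophes:
    a catastrophe (rate gamma) wipes out all customers and puts the system
    under repair (completed at rate eta); arrivals during repair are assumed
    lost.  Returns the time-average number of customers in the system."""
    rng = random.Random(seed)
    t, n, working, area = 0.0, 0, True, 0.0
    while t < horizon:
        total = lam + n * mu + gamma if working else eta
        dt = rng.expovariate(total)
        area += n * dt
        t += dt
        if not working:
            working = True               # repair finished; system empty again
            continue
        u = rng.random() * total
        if u < lam:
            n += 1                       # arrival
        elif u < lam + n * mu:
            n -= 1                       # one of the n services completes
        else:
            n, working = 0, False        # catastrophe empties the system
    return area / horizon

mean_n = mm_inf_catastrophes()
```

Without catastrophes the stationary mean would be λ/μ; the periodic resets and repair periods pull the time-average below that benchmark, which gives a quick sanity check on analytical results.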

 

Keywords: M/M/∞ Queue, Server Breakdown, Transient Analysis, Steady State Solution, Confluent Hypergeometric Function, Reliability and Availability

 

 

Cite: Gulab Singh Bura. M/M/∞ Queue with Catastrophes and Repairable Servers. Reliability: Theory & Applications. 2022, December, 4(71): 143-153. https://doi.org/10.24412/1932-2321-2022-471-143-153

 



 

A NEW ALGORITHM TO SOLVE FUZZY TRANSPORTATION MODEL WITH L-R TYPE HEXAGONAL FUZZY NUMBERS USING RANKING FUNCTION

 

CH. Uma Swetha

Anil Neerukonda Institute of Science &

Technology, Visakhapatnam, India
umaswethachitta@gmail.com
 

N. Ravishankar

Gitam Deemed to be university, GSS,

Visakhapatnam, India
drravi68@gmail.com
 

Indira Singuluri 

Vignan’s Institute of Information

Technology (A), Duvvada,

Visakhapatnam, India
indira.singuluri@gmail.com

 

Transportation problems have many uses in logistics and supply chains for minimizing costs. In real-life circumstances, the parameters of transportation models may not be known precisely because of unmanageable factors. In several research papers, the transportation costs and the availability and demands of the commodity are represented as general fuzzy numbers or L-R flat fuzzy numbers, and the transportation cost is minimized using different algorithms. In this article, the fuzzy costs, supply, and demands of the commodity at origins and destinations are taken as L-R type hexagonal fuzzy numbers, and the optimal solution of unbalanced and balanced fuzzy transportation models is obtained by using a ranking function to get the minimum transportation cost. Numerical examples are included. The method is simple for a decision maker to express and execute in real-world transportation problems.
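A sketch of the ranking-function pipeline: defuzzify the hexagonal fuzzy costs with an assumed ranking (the mean of the six vertices — one common choice, not necessarily the paper's), then solve the resulting crisp problem. The 2×2 instance and its fuzzy costs are hypothetical; it is small enough to solve exactly because a balanced 2×2 transportation problem has a single free variable.

```python
def rank_hexagonal(fn):
    """Assumed ranking function: the average of the six vertices
    (a1, ..., a6) of an L-R hexagonal fuzzy number."""
    return sum(fn) / 6.0

def solve_2x2_tp(cost, supply, demand):
    """A balanced 2x2 transportation problem has one free variable x = x11;
    the objective is linear in x, so the optimum is at an endpoint of its range."""
    lo = max(0, supply[0] - demand[1])   # keep x22 = d2 - (s1 - x) nonnegative
    hi = min(supply[0], demand[0])
    def total(x):
        x11, x12 = x, supply[0] - x
        x21, x22 = demand[0] - x, demand[1] - (supply[0] - x)
        return (cost[0][0] * x11 + cost[0][1] * x12 +
                cost[1][0] * x21 + cost[1][1] * x22)
    return min(total(lo), total(hi))

# Hypothetical hexagonal fuzzy costs, defuzzified before solving:
fuzzy_cost = [[(1, 2, 3, 4, 5, 6), (4, 5, 6, 7, 8, 9)],
              [(2, 3, 4, 5, 6, 7), (1, 1, 2, 2, 3, 3)]]
crisp = [[rank_hexagonal(c) for c in row] for row in fuzzy_cost]
best = solve_2x2_tp(crisp, supply=[30, 40], demand=[35, 35])
```

For larger instances the crisp problem produced by the ranking step would be handed to any standard transportation-problem solver; only the defuzzification step is specific to the fuzzy setting.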

 

Keywords: L-R HFN’s, ranking function, Transportation problem, balanced transportation problem and unbalanced transportation problem

 

 

Cite: CH. Uma Swetha, N. Ravishankar, Indira Singuluri. A NEW ALGORITHM TO SOLVE FUZZY TRANSPORTATION MODEL WITH L-R TYPE HEXAGONAL FUZZY NUMBERS USING RANKING FUNCTION. Reliability: Theory & Applications. 2022, December, 4(71): 154-163. https://doi.org/10.24412/1932-2321-2022-471-154-163

 



 

Solving Bi-objective Assignment Problem under Neutrosophic Environment

 

S. Sandhiya, D. Anuradha

Department of Mathematics,

School of Advanced Sciences,

Vellore Institute of Technology,

Vellore, Tamil Nadu 632014, India.
sandhiyanive98@gmail.com,

anuradhadhanapal1981@gmail.com
 

The assignment problem (AP) is a decision-making problem used in production planning, industrial organizations, the economy and so on. As the single-objective AP is no longer sufficient to handle today's optimization problems, the bi-objective AP (BOAP) is considered. This research article introduces the BOAP in a neutrosophic environment. The neutrosophic BOAP (NBOAP) is formulated by taking the elements of the cost matrices as single-valued trapezoidal neutrosophic numbers (SVTrNNs). A new method, namely the fixing point approach (FPA), is proposed in this paper. The aim of this study is not only to determine the set of efficient solutions but also to find the optimal compromise solution for the NBOAP using the FPA. The proposed approach is elucidated with a numerical example, and its solutions are plotted in a graph using MATLAB, which demonstrates its efficiency and optimality in practical aspects. This approach is more profitable for decision makers (DMs) and more efficient than other existing approaches because it provides the best optimal compromise solution in a neutrosophic environment.

 

Keywords: Bi-objective neutrosophic assignment problem, Hungarian method (HM), Fixing point approach, Ideal solution, Efficient solution, Optimal compromise solution.

 

 

Cite: S. Sandhiya, D. Anuradha. Solving Bi-objective Assignment Problem under Neutrosophic Environment. Reliability: Theory & Applications. 2022, December, 4(71): 164-175. https://doi.org/10.24412/1932-2321-2022-471-164-175

 



 

Fuzzy Linear Programming Approach for Solving Production Planning Problem

 

Mahesh M. Janolkar

Department of First Year Engineering
Prof Ram Meghe College of Engineering

and Management Badnera-Amravati (MS India)
maheshjanolkar@gmail.com
 

Kirankumar L. Bondar
P. G. Department of Mathematics,

Govt Vidarbh Institute of Science

and Humanities, Amravati.
klbondar_75@rediffmail.com
 

Pandit U. Chopade
Research Supervisor, Department

of Mathematics, D.S.M’s Arts Commerce

and Science College ,Jintur.
chopadepu@rediffmail.com

 

One of the various optimization methods that addresses optimization under uncertainty is fuzzy linear programming (FLP). This model can be used when there is ambiguity in the situation because it is not precisely specified, or when the problem does not require an exact value. With fuzzy linear programming there is a range of grey between the two extremes, as opposed to binary models, where an event may only be either black or white. As a result, it broadens the range of potential applications, because most scenarios involve a spectrum of values rather than a bipolar state. In this article, a new FLP-based method is developed using a single membership function (MF), called the modified logistic MF. The modified logistic MF and its modifications, which take into account the characteristics of the parameters, result from the analysis process. This MF was tested for useful performance by modeling using FLP. The developed version of FLP provides confidence in the existing IPPP application. This approach to resolving the IPPP can receive feedback from the decision maker, the implementer and the analyst; in this case, the process can be called interactive FLP. Fuzzy sets for MPS problems can be constructed to find satisfactory solutions. The decision maker, researcher and practitioner can apply their knowledge and experience to get the best results.

 

Keywords: Fuzzy Linear Programming, Degree of Satisfaction, Production Planning, Fuzzy PF, Vagueness

 

 

Cite: Mahesh M. Janolkar, Kirankumar L. Bondar, Pandit U. Chopade. Fuzzy Linear Programming Approach for Solving Production Planning Problem. Reliability: Theory & Applications. 2022, December, 4(71): 176-185. https://doi.org/10.24412/1932-2321-2022-471-176-185

 



 

Robust regression algorithms with kernel functions in Support Vector Regression Models

 

Muthukrishnan. R

Professor, Department of Statistics,

Bharathiar University, Tamil Nadu
muthukrishnan1970@gmail.com

 

Kalaivani. S 

Research Scholar, Department of Statistics,

Bharathiar University, Tamil Nadu
kalaivanistatistics1994@gmail.com

 

In machine learning, support vector machines (SVMs) are supervised learning models with associated learning algorithms that analyze data for classification and regression analysis. The SVM is one of the most robust prediction methods based on statistical learning frameworks. Regression is a statistical method that attempts to determine the strength and character of the relationship between dependent and independent variables. This paper explores the idea of support vector regression (SVR). The most commonly used classical procedure is least squares, which is less efficient and very sensitive when the data contain outliers. To overcome these limitations, alternative robust regression procedures exist, such as LMS regression, the S-estimator, the MM-estimator and support vector regression. In this study, comparisons have been made between the classical regression procedure and the robust regression procedures; the various error measures show that the robust regression procedures are much more efficient. An attempt has also been made to review the existing theory and methods of SVR.
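The outlier sensitivity the abstract contrasts can be seen in a toy comparison. Theil-Sen stands in here as a simple robust estimator (the paper discusses LMS, S-, MM-estimators and SVR), and the data are invented.

```python
import statistics
from itertools import combinations

def ols_slope(xs, ys):
    """Ordinary least-squares slope — efficient, but pulled around by outliers."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

def theil_sen_slope(xs, ys):
    """Median of all pairwise slopes — a simple robust regression estimator."""
    slopes = [(ys[j] - ys[i]) / (xs[j] - xs[i])
              for i, j in combinations(range(len(xs)), 2)]
    return statistics.median(slopes)

xs = list(range(10))
ys = [2.0 * x + 1.0 for x in xs]   # true slope is 2
ys[9] = 60.0                       # one gross outlier
b_ols = ols_slope(xs, ys)
b_robust = theil_sen_slope(xs, ys)
```

A single contaminated point drags the least-squares slope well away from 2, while the median-based estimate is unaffected — the qualitative behaviour the paper's error-measure comparisons quantify.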

 

Keywords: Linear regression, Robust regression, kernels, Support Vector Regression

 

 

Cite: Muthukrishnan, R., Kalaivani, S. Robust regression algorithms with kernel functions in Support Vector Regression Models. Reliability: Theory & Applications. 2022, December, 4(71): 186-191. https://doi.org/10.24412/1932-2321-2022-471-186-191

 



 

The Seasonal Effect of Working Conditions of an Ice-cream Plant

 

Upasana Sharma, Drishti 

Department Statistics, Punjabi University,

Patiala- 147002, India
usharma@pbi.ac.in,

drish2796@gmail.com

 

The workings of an ice-cream plant in the summer and winter seasons are analyzed in this paper. The ice-cream unit, along with the other three units, i.e., the flavouring, freezing, and combined flavouring-and-freezing units, is always operational in summer due to the high demand, while in winter the combined flavouring-and-freezing unit is kept in cold standby as a backup in case there is demand for ice-cream. In this work, the semi-Markov process and the regenerative point technique have been used to analyze the system. Numerical analysis has been conducted using MATLAB, and Code Blocks has been used in interpreting the graph in the specific case presented. A variety of measures have been developed to evaluate the effectiveness of the system. All evaluation is based on the milk production data collected by the plant. Improvements to the system performance will lead to increased profits, and similar techniques can be applied to other systems.

 

Keywords: Seasonal functioning, semi-Markov process, Regenerative point technique, profit

 

 

Cite: Upasana Sharma, Drishti. The Seasonal Effect of Working Conditions of an Ice-cream Plant. Reliability: Theory & Applications. 2022, December, 4(71): 192-203. https://doi.org/10.24412/1932-2321-2022-471-192-203

 



 

 An Upgraded Approach to Solve Fuzzy Transportation Problems

 

Kaushik A Joshi

Department of First Year Engineering
Prof Ram Meghe College of Engineering

and Management Badnera-Amravati (MS India)
kaushikanandjoshi@gmail.com
 

Kirankumar L. Bondar
P. G. Department of Mathematics, Govt Vidarbh Institute

of Science and Humanities, Amravati.
klbondar_75@rediffmail.com
 

Pandit U. Chopade
Research Supervisor, Department of Mathematics,
D.S.M’s Arts Commerce and Science College, Jintur.
chopadepu@rediffmail.com
 

 

The transportation problem (TP) has many applications in cost reduction. Good algorithms have been developed to solve the TP when all the given parameters, namely the supply, the demand, and the transportation costs, are known exactly. However, in real applications there are many situations involving uncertainty, so it is important to study the TP in an uncertain environment. In this paper, an updated procedure is proposed to solve the fuzzy transportation problem (FTP) in which all parameters are represented by fuzzy numbers. First, the FTP is converted into a linear programming problem with fuzzy constraints; second, this fuzzy problem is reduced to a crisp one that can be solved by standard methods. The value of the updated method is assessed in comparison with existing methods on an application model. The results obtained show that the updated method proposed in this study is simpler and more efficient than some existing methods commonly used in the literature.

 

Keywords: Fuzzy Transportation Problem, Fuzzy Numbers, Solid Transportation Problems, Linear Programming Problem.

 

 

Cite: Kaushik A Joshi, Kirankumar L. Bondar, Pandit U. Chopade. An Upgraded Approach to Solve Fuzzy Transportation Problems. Reliability: Theory & Applications. 2022, December 4(71): 204-217. https://doi.org/10.24412/1932-2321-2022-471-204-217

 


204-217

 

Minimax Estimation of the Scale Parameter of Inverse Rayleigh Distribution under Symmetric and Asymmetric Loss Functions

 

Proloy Banerjee, Shreya Bhunia

Department of Mathematics and Statistics,

Aliah University, Kolkata, India
proloy.stat@gmail.com,

shreyabhunia.stat@gmail.com 

 

In this article, minimax estimation of the scale parameter lambda of the inverse Rayleigh distribution is performed under symmetric (QLF) and asymmetric (SLELF and GELF) loss functions by applying Lehmann's theorem (1950). An extended Jeffrey's prior and a gamma prior are assumed to derive the minimax estimators under each of the considered loss functions. An extensive simulation study is carried out to compare the performance of the minimax estimators with the maximum likelihood estimator (MLE), traditionally used as the classical estimator, on the basis of biases and mean squared errors (MSE). The obtained results suggest that under the extended Jeffrey's prior, minimax estimators with positive c values are superior to the MLE. Moreover, in most of the cases, the minimax estimator under the quadratic loss function (QLF) performs satisfactorily under the gamma prior.
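For the parameterization F(x) = exp(-lambda/x^2) (the paper's parameterization may differ), the MLE of lambda has a closed form, since 1/X^2 is exponential with rate lambda. A minimal Monte Carlo sketch of the MLE's bias and MSE, the classical baseline the abstract compares against; the minimax estimators themselves depend on prior constants not reproduced here:

```python
# Monte Carlo bias/MSE of the MLE of the inverse Rayleigh scale
# parameter, using the parameterization F(x) = exp(-lam / x**2).
# If X is inverse Rayleigh, 1/X**2 ~ Exp(lam), which gives both a
# sampler and the closed-form MLE  lam_hat = n / sum(1/x**2).
import numpy as np

rng = np.random.default_rng(1)
lam, n, reps = 2.0, 100, 2000
est = np.empty(reps)
for r in range(reps):
    u = rng.uniform(size=n)
    x = np.sqrt(-lam / np.log(u))   # inverse-transform sampling
    est[r] = n / np.sum(x ** -2)

bias = est.mean() - lam             # theory: E[lam_hat] = n*lam/(n-1)
mse = np.mean((est - lam) ** 2)
print(bias, mse)
```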

 

Keywords: Minimax estimator, squared log error loss function, quadratic loss function, general entropy loss function, extended Jeffrey’s prior, risk function

 

 

Cite: Proloy Banerjee, Shreya Bhunia. Minimax Estimation of the Scale Parameter of Inverse Rayleigh Distribution under Symmetric and Asymmetric Loss Functions. Reliability: Theory & Applications. 2022, December 4(71): 218-231. https://doi.org/10.24412/1932-2321-2022-471-218-231

 


218-231

 

The power continuous Bernoulli distribution: Theory and applications

 

Christophe Chesneau

Department of Mathematics, LMNO,

University of Caen, 14032 Caen, France.
christophe.chesneau@unicaen.fr

 

Festus C. Opone

Department of Statistics,

University of Benin, Benin City, Nigeria.
festus.opone@physci.uniben.edu

 

The continuous Bernoulli distribution is a recently introduced one-parameter distribution with support [0, 1], finding numerous applications in applied statistics. The idea of this article is to propose a natural extension of this distribution by adding a shape parameter through a power transformation. We introduce the power continuous Bernoulli distribution, aiming to extend the modeling scope of the continuous Bernoulli distribution. Basics of its mathematical properties are derived, such as the shapes of the related functions, the determination of various moment measures, and an evaluation of the overall amount of its randomness via the Rényi entropy. A statistical analysis of the distribution is then performed, showing how it can be applied when dealing with data. Estimates of the parameters are discussed through the maximum likelihood method. A Monte Carlo simulation study investigates the asymptotic behavior of these estimates. The flexibility of the power continuous Bernoulli distribution in real-life data fitting is analyzed using two data sets. Also, fair competitors are considered to highlight the accuracy of this distribution. At all stages, numerous graphics and tables illustrate the findings.

 

Keywords: Continuous Bernoulli distribution; moments; quantiles; entropy; data fitting

 

 

Cite: Christophe Chesneau, Festus C. Opone. The power continuous Bernoulli distribution: Theory and applications. Reliability: Theory & Applications. 2022, December 4(71): 232-248. https://doi.org/10.24412/1932-2321-2022-471-232-248

 


232-248

 

Stress-strength Reliability for Equi-correlated Multivariate Normal and its estimation

 

Anirban Goswami

Regional Research Institute of

Unani Medicine, Patna, Bihar
anirbangoswami09@gmail.com

 

Babulal Seal

Department of Mathematics and Statistics,

Aliah University, Kolkata, West Bengal
babulal_seal@yahoo.com

 

This article focuses on the estimation of stress-strength reliability under an equi-correlated multivariate setup. In some situations the components of a system are equi-correlated; generally, the form of the equi-correlation structure within the components is known for a given situation, but the parameters involved in it are unknown. We propose a procedure to compute and estimate the stress-strength reliability R = Pr(a'x > b'y) when x and y are not independent and follow an equi-correlated multivariate normal distribution, where a and b are two known vectors. Method of moments estimators are proposed for the unknown parameters. In essence, we want the probability that overall strength is larger than overall stress, taking a'x and b'y as their representatives; for example, principal components of the respective vectors do this job approximately. An asymptotic distribution is used to obtain confidence intervals for the stress-strength reliability, the performance of these intervals is checked through a simulation study, and finally a real data analysis is provided.
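Since a'X - b'Y is normal when X and Y are jointly normal, R has a closed form that Monte Carlo can verify. A sketch with an equi-correlated covariance; for simplicity it takes X and Y independent (the paper treats a dependent setup), and all vectors and parameters are invented:

```python
# Sketch of R = Pr(a'X > b'Y) for multivariate normal X (strength) and
# Y (stress) with equi-correlated covariances, X and Y independent
# here for simplicity.  Closed form R = Phi(mu_d / sd_d) is checked
# against Monte Carlo.
import numpy as np
from math import erf, sqrt

def equicorr(p, sigma2, rho):
    """Equi-correlated covariance: sigma2 * ((1-rho) I + rho J)."""
    return sigma2 * ((1 - rho) * np.eye(p) + rho * np.ones((p, p)))

p, rho = 3, 0.4
a = np.array([1.0, 1.0, 1.0]); b = np.array([1.0, 0.5, 0.5])
mx = np.array([5.0, 5.0, 5.0]); my = np.array([3.0, 3.0, 3.0])
Sx = equicorr(p, 1.0, rho);     Sy = equicorr(p, 2.0, rho)

mu_d = a @ mx - b @ my                       # mean of a'X - b'Y
var_d = a @ Sx @ a + b @ Sy @ b              # variance (independence)
R_exact = 0.5 * (1 + erf(mu_d / sqrt(2 * var_d)))

rng = np.random.default_rng(2)
X = rng.multivariate_normal(mx, Sx, 200_000)
Y = rng.multivariate_normal(my, Sy, 200_000)
R_mc = np.mean(X @ a > Y @ b)
print(R_exact, R_mc)
```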

 

Keywords: Equi-correlated; Principal Component; Method of Moments Estimator (MOM); Asymptotic.

 

 

Cite: Anirban Goswami, Babulal Seal. Stress-strength Reliability for Equi-correlated Multivariate Normal and its estimation. Reliability: Theory & Applications. 2022, December 4(71): 249-267. https://doi.org/10.24412/1932-2321-2022-471-249-267

 


249-267

 

RELIABILITY ANALYSIS FOR GDC SYSTEM USING REPAIR AND REPLACEMENT FACILITY IN PISTON FOUNDRY PLANT

 

Raman Gill, Upasana Sharma 

Department of Statistics,

Punjabi University,Patiala
ramanpbi.ac.in@gmail.com,

usharma@pbi.ac.in

 

Systems in industry are greatly impacted by failure, so eliminating defects is essential for enhancing system performance. This study assesses the range of repair/replacement facilities in the GDC (Gravity Die Casting) system at a piston foundry plant. The GDC system is made up of one main unit connected to two sub-units, and any component failure results in system failure. In that situation, repair of the system is attempted first, and if that is unsuccessful, the component is replaced. To operate effectively, the primary unit must be built of aluminium alloy (Al), and a lack of this raw material also leads to system failure. Using semi-Markov processes and the regenerative point technique, the relevant measures were computed numerically and graphically. The results of this study are novel, since no prior research has concentrated on repair/replacement facilities for the GDC system at piston foundries. The conclusions are very helpful for businesses that manufacture pistons and utilise the GDC system.

 

Keywords: GDC, repair, replacement, semi-Markov process, regenerating point technique.

 

 

Cite: Raman Gill, Upasana Sharma. RELIABILITY ANALYSIS FOR GDC SYSTEM USING REPAIR AND REPLACEMENT FACILITY IN PISTON FOUNDRY PLANT. Reliability: Theory & Applications. 2022, December 4(71): 268-281. https://doi.org/10.24412/1932-2321-2022-471-268-281

 


268-281

 

Comparison of Bridge Systems with Multiple Types of Components

 

Garima Chopra, Deepak Kumar

Department of Mathematics, University

Institute of Engineering & Technology, Maharshi
Dayanand University, Rohtak, Haryana, India
garima.chopra@gmail.com,

dkkr111@gmail.com

 

This paper aims to compare some bridge systems with multiple types of components in stochastic, hazard rate, and likelihood ratio order. Such systems are generally used in the design and production industries, supported by a buffer store that balances the fluctuation of two production lines during the production process. The survival signature tool and the distortion function technique are employed to compare the performance of four different bridge systems: the survival signature, and hence the survival function, is computed for each considered system. The findings of the comparisons are presented with the help of tables and figures. The comparison of large coherent systems based on the structure-function approach is quite challenging; as this study is based on the survival signature, it is less complex and has future scope.
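The survival signature mentioned above can be computed by brute force for small systems: for each number l of working components, it is the fraction of size-l working sets that keep the system up. A toy sketch for a 3-component series-parallel system (the paper's bridge systems with multiple component types extend this idea; the structure function here is invented):

```python
# Minimal survival-signature computation by exhaustive enumeration for
# a toy system: component 0 in series with the parallel pair {1, 2}.
from itertools import combinations
from math import comb

def works(up):
    """Structure function: system is up iff 0 works and (1 or 2) works."""
    return 0 in up and (1 in up or 2 in up)

def survival_signature(n):
    """phi(l) = fraction of size-l working sets that keep the system up."""
    sig = []
    for l in range(n + 1):
        ok = sum(works(s) for s in combinations(range(n), l))
        sig.append(ok / comb(n, l))
    return sig

print(survival_signature(3))   # [0.0, 0.0, 0.6666666666666666, 1.0]
```

With the signature in hand, the survival function follows by mixing over the distribution of the number of working components, which is what makes comparisons of systems with different structures tractable.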

 

Keywords: survival signature; bridge system; survival function; distortion function

 

 

Cite: Chopra, G., Kumar, D. Comparison of Bridge Systems with Multiple Types of Components. Reliability: Theory & Applications. 2022, December 4(71): 282-296. https://doi.org/10.24412/1932-2321-2022-471-282-296

 


282-296

 

Classical and Bayesian Estimation of Parameter of SSE(e)-distribution Under Type-II Censored Data

 

P. Kumar

Department of Statistics, Faculty

of Science and Technology,
Mahatma Gandhi Kashi Vidyapith,

Varanasi, India-221002
 

D. Kumar, U. Singh 

Department of Statistics, Banaras

Hindu University, Varanasi, India-221005
 

P. Kumar

Department of Statistics, Udai Pratap

Autonomous College, Varanasi, India-221003
pawanchauhanstranger@gmail.com

 

In the present work, we consider a lifetime distribution based on a trigonometric function, called the SSE(e)-distribution, and discuss various properties of it that have not been covered previously by the original authors or others. This distribution is useful and a good contribution to research on distributions based on trigonometric functions. We derive some further properties, such as moments, conditional moments, mean deviation about the mean, mean deviation about the median, and order statistics. Estimation of the parameter has been done in both the classical and Bayesian paradigms under Type-II censored samples. A simulation study has also been carried out to assess the performance of the estimators in the sense of having the smallest risk (over the sample space) in long-run use.

 

Keywords: SSE(e)-distribution, Type-II censoring, Bayes estimator, MLE, Gauss-Laguerre method, risk function

 

 

Cite: Kumar, P., Kumar, D., Kumar, P., Singh, U. Classical and Bayesian Estimation of Parameter of SSE(e)-distribution Under Type-II Censored Data. Reliability: Theory & Applications. 2022, December 4(71): 297-310. https://doi.org/10.24412/1932-2321-2022-471-297-310

 


297-310

 

Statistical properties and estimation procedures for a new flexible two parameter lifetime distribution

 

S. K. Singh, Suraj Yadav, Abhimanyu Singh Yadav

Department of Statistics,

Banaras Hindu University, Varanasi, India.
singhsk64@gmail.com,

asybhu10@gmail.com

 

In this article, a new transformation technique based on the cumulative distribution function is proposed; it is very useful for generating classes of lifetime distributions. The various statistical properties of the proposed transformation method are studied. Further, the proposed technique is illustrated by taking the exponential distribution as the baseline distribution. Various statistical properties, such as the survival and hazard rate functions, moments, mean deviation about the mean and median, order statistics, the moment generating function (MGF), Bonferroni and Lorenz curves, entropy, and stress-strength reliability, are discussed. Different classical estimation methods are used to estimate the unknown parameters. Finally, two real data sets are considered to justify the use of the proposed distribution in real scenarios.
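One standard CDF-based transformation, shown only as an example of the general idea (it is not necessarily the authors' technique): raising a baseline cdf F to a power alpha adds a shape parameter, and the new distribution can be sampled by inverse transform. The exponential baseline and alpha = 2 are invented for illustration:

```python
# Exponentiated-CDF transformation sketch: G(x) = F(x)**alpha with an
# exponential baseline F(x) = 1 - exp(-lam*x).  Inverse-transform
# sampling, checked against the theoretical cdf at one point.
import numpy as np

alpha, lam = 2.0, 1.0
G = lambda x: (1 - np.exp(-lam * x)) ** alpha          # transformed cdf
Ginv = lambda u: -np.log(1 - u ** (1 / alpha)) / lam   # its inverse

rng = np.random.default_rng(3)
sample = Ginv(rng.uniform(size=100_000))

m = 1.0
print(np.mean(sample <= m), G(m))   # empirical vs theoretical cdf
```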

 

Keywords: Transformation technique, statistical properties, classical method of estimation, and application.

 

 

Cite: Singh, S. K., Suraj Yadav, Abhimanyu Singh Yadav. Statistical properties and estimation procedures for a new flexible two parameter lifetime distribution. Reliability: Theory & Applications. 2022, December 4(71): 311-330. https://doi.org/10.24412/1932-2321-2022-471-311-330

 


311-330

 

A DIFFERENT INITIATIVE TO FIND AN OPTIMAL SOLUTION TO THE TRIANGULAR FUZZY TRANSPORTATION PROBLEM BY IMPLEMENTING THE ROW-COLUMN MAXIMA METHOD

 

A. Kokila

Research Scholar, Department of Mathematics,

SAS, VIT, Vellore, Tamil Nadu, India
kokila.a@vit.ac.in
 

G. Deepa 

Assistant Professor, Department of Mathematics,

SAS, VIT, Vellore, Tamil Nadu, India
deepa.g@vit.ac.in

 

In this paper, we discuss the fuzzy transportation problem, which involves fuzzy costs, fuzzy supply, and fuzzy product needs. The goal of this article is to convey items from the point of origin to the point of destination at the least possible cost. For fuzzy transportation problems of both balanced and unbalanced types, the proposed technique provides a superior optimal solution. Transportation costs, supply, and demand are represented by generalized triangular fuzzy numbers in the proposed Row-Column Maxima Method (RCMM). A numerical example of a fuzzy transportation problem is illustrated, and the solution is compared with the outcomes of other approaches. The method reduces the number of iterations, which makes it easy to understand and implement in real-life applications.
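A generic defuzzify-then-solve sketch of a fuzzy transportation problem, not the RCMM itself: triangular fuzzy costs (l, m, u) are reduced to crisp values by the centroid (l + m + u)/3, and the resulting crisp problem is solved as a linear program. The instance and the use of SciPy are this note's assumptions:

```python
# Triangular fuzzy costs -> crisp centroids -> transportation LP.
import numpy as np
from scipy.optimize import linprog

def centroid(tfn):                       # triangular fuzzy number -> crisp
    return sum(tfn) / 3.0

fuzzy_cost = [[(0, 1, 2), (1, 2, 3)],    # 2 sources x 2 destinations
              [(2, 3, 4), (0, 1, 2)]]
supply, demand = [20, 30], [25, 25]

c = np.array([[centroid(t) for t in row] for row in fuzzy_cost]).ravel()
A_eq = np.zeros((4, 4))
b_eq = supply + demand                   # [20, 30, 25, 25]
for i in range(2):                       # row (supply) constraints
    A_eq[i, 2 * i: 2 * i + 2] = 1
for j in range(2):                       # column (demand) constraints
    A_eq[2 + j, [j, 2 + j]] = 1
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")
print(res.fun)                           # minimal total crisp cost
```

Here the crisp cost matrix is [[1, 2], [3, 1]], whose optimal shipping plan costs 60.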

 

Keywords: Fuzzy set, Fuzzy Number, Triangular fuzzy number, Fuzzy Transportation problem, RCM Method, Fuzzy optimal solution.

 

 

Cite: A. Kokila, G. Deepa. A DIFFERENT INITIATIVE TO FIND AN OPTIMAL SOLUTION TO THE TRIANGULAR FUZZY TRANSPORTATION PROBLEM BY IMPLEMENTING THE ROW-COLUMN MAXIMA METHOD. Reliability: Theory & Applications. 2022, December 4(71): 331-343. https://doi.org/10.24412/1932-2321-2022-471-331-343

 


331-343

 

A METHOD FOR GENERATING LIFETIME MODELS AND ITS APPLICATION TO REAL DATA

 

Fasna K 

University of Calicut
fasna.asc@gmail.com 

 

In the present work, we propose a new transformation called the Beta transformation. The new model includes the exponential distribution as a special case and is called the Beta transformed exponential (BTE) distribution. We obtain its various statistical properties, such as moments, the moment generating function, the median, the hazard rate function, entropies, and order statistics. Parameters of the BTE distribution are estimated by the method of maximum likelihood, the Cramer-von Mises method, and the method of least squares. Monte Carlo simulation is performed in order to investigate the performance of these estimates. Finally, two data sets are analyzed to show how the proposed model works in practice.
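The estimation methods named above, sketched on a plain exponential model rather than the BTE distribution (whose form is not reproduced here): maximum likelihood has a closed form, while the Cramer-von Mises estimate minimizes the squared distance between the model cdf and the empirical cdf. Sample size and true rate are invented:

```python
# MLE vs Cramer-von-Mises estimation for an exponential rate.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(4)
lam_true, n = 2.0, 500
x = np.sort(rng.exponential(1 / lam_true, n))

lam_mle = 1 / x.mean()                       # closed-form MLE

emp = (np.arange(1, n + 1) - 0.5) / n        # empirical cdf levels
cvm = lambda lam: np.sum(((1 - np.exp(-lam * x)) - emp) ** 2)
lam_cvm = minimize_scalar(cvm, bounds=(0.01, 20), method="bounded").x
print(lam_mle, lam_cvm)
```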

 

Keywords: Cramer-von-Mises method, Exponential distribution, Hazard rate function, Method of maximum likelihood, Method of least squares

 

 

Cite: Fasna, K. A METHOD FOR GENERATING LIFETIME MODELS AND ITS APPLICATION TO REAL DATA. Reliability: Theory & Applications. 2022, December 4(71): 344-357. https://doi.org/10.24412/1932-2321-2022-471-344-357

 


344-357

 

Second Order Sliding Mode Control for Robust Performance of the Systems

 

V. S. Biradar, G. M. Malwatkar

Instrumentation Engineering

Department Government College

of Engineering Jalgaon-425002 India
vjaybiradar@gmail.com, gajananm@gmail.com

 

An integral PID sliding surface with a first-order filter is proposed in this paper for single-input single-output (SISO) systems. The developed sliding mode controller performs well even when the system model differs through parametric uncertainty. To verify its robustness to disturbances, the presented work validates the controller's performance under the application of an external load. An integral, filtered sliding surface has advantages in terms of system stability. The stability and robustness properties of the proposed controller are proven by Lyapunov's stability theorem. By adopting a switching gain with predetermined system parameters, the chattering phenomenon is greatly reduced; the proposed controller is therefore appropriate for extended use in real-world systems. The proposed control method is verified using simulation examples, and its performance is compared to a similar controller from the previous literature.
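A minimal first-order sliding mode illustration, not the paper's second-order PID-surface controller: an unstable scalar plant dx/dt = x + u is driven to the origin by the switching law u = -K sign(x), with K chosen large enough to dominate the drift; all values are invented:

```python
# First-order sliding mode sketch: unstable plant dx/dt = x + u,
# switching control u = -K*sign(x).  The state reaches a small band
# around the origin and chatters there (band width ~ K*dt).
import numpy as np

K, dt = 3.0, 1e-3
x = 1.0
for _ in range(5000):                 # 5 seconds of Euler integration
    u = -K * np.sign(x)               # switching control law
    x += (x + u) * dt
print(abs(x))                         # small residual chatter near 0
```

The residual oscillation band is exactly the chattering the abstract mentions; smoothing the sign function or adapting the gain shrinks it.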

 

Keywords: Integral sliding mode control, Robustness, Stability, Uncertain systems

 

 

Cite: Biradar, V. S., Malwatkar, G. M. Second Order Sliding Mode Control for Robust Performance of the Systems. Reliability: Theory & Applications. 2022, December 4(71): 358-370. https://doi.org/10.24412/1932-2321-2022-471-358-370

 


358-370

 

An Effective Sentiment Analysis in Hindi-English Code-Mixed Twitter Data using Swea Clustering and Hybrid BLSTM-CNN Classification

 

Abhishek Kori

Research School, Information Technology,

Shri Vaishnav Vidyapeeth Vishwavidyalaya,
Indore, Madhya Pradesh 453111, India.

abhishekkoriphd05@gmail.com

 

Jigyasu Dubey

Professor, Head Information Technology

Department, Shri Vaishnav Vidyapeeth Vishwavidyalaya,
Indore, Madhya Pradesh 453111, India.
 

Sentiment Analysis is the process of examining an individual's emotions. In tweet sentiment analysis, opinions in messages are categorized into positive, negative, and neutral categories. A clustering-based classification approach is used to increase the accuracy level and enhance the performance of sentiment classification. The input dataset comprises Hindi-English code-mixed text data. Initially, the input text data is pre-processed with different techniques such as stop word removal, tokenization, stemming, and lemmatization. This effectively pre-processes the data and makes it appropriate for further processing. Afterwards, effective features such as count vectors, modified term frequency-inverse document frequency (MTF-IDF), feature hashing, GloVe features, and Word2vec features are extracted to enhance the classification performance. Then, sentiment word embedding-based agglomerative (SWEA) clustering is presented for effective sentiment feature clustering. Finally, a hybrid bidirectional long short-term memory-convolutional neural network (Hybrid BLSTM-CNN) is used to accurately classify tweet sentiments into positive, negative, and neutral. Here, a modified horse herd optimization (MHHO) approach is used for weight optimization in the Hybrid BLSTM-CNN; this optimization further enhances the classification performance. The dataset used for the implementation is a Hindi-English mixed dataset. The experimental results show significant improvement over different existing approaches in terms of accuracy, precision, recall, and F-measure.

 

Keywords: Sentiment Analysis, Hindi-English, Twitter, code-mixed text data, modified horse herd optimization

 

 

Cite: Abhishek Kori, Jigyasu Dubey. An Effective Sentiment Analysis in Hindi-English Code-Mixed Twitter Data using Swea Clustering and Hybrid BLSTM-CNN Classification. Reliability: Theory & Applications. 2022, December 4(71): 371-391. https://doi.org/10.24412/1932-2321-2022-471-371-391

 


371-391

 

Confidence intervals for the reliability characteristics via different estimation methods for the power Lindley model

 

Abhimanyu S. Yadav

Department of Statistics, Banaras

Hindu University, Varanasi, India.

abhistats@bhu.ac.in
 

P. K. Vishwakarma

Department of Mathematics and

Statistics, MLSU, Udaipur, Rajasthan, India.

vpradeep4u@gmail.com
 

H. S. Bakouch

Department of Mathematics,

Faculty of Science, Tanta University, Tanta, Egypt.

hassan.bakouch@science.tanta.edu.eg
 

Upendra Kumar

Department of Statistics,

U.P. College, Varanasi, India.

ukumarupc@gmail.com
 

S. Chauhan

Department of Statistics, Central University

of Rajasthan, Rajasthan, India.

2014imsst021@curaj.ac.in

 

In this article, classical and Bayes interval estimation procedures have been discussed for the reliability characteristics, namely mean time to system failure, reliability function, and hazard function, for the power Lindley model and its special case. In the classical part, maximum likelihood estimation and maximum product spacing estimation are discussed for estimating the reliability characteristics. Since exact confidence intervals for the reliability characteristics cannot be computed directly, asymptotic confidence intervals are constructed using large sample theory with the above-mentioned classical estimation methods. Further, the bootstrap (standard-boot, percentile-boot, Student's t-boot) confidence intervals are also obtained. Next, Bayes estimators are derived with a gamma prior using the squared error and linex loss functions, and Bayes credible intervals for the same characteristics are constructed from simulated posterior samples. The obtained estimators are evaluated in a Monte Carlo simulation study in terms of mean square error, average width, and coverage probability. A real-life example has also been illustrated for application purposes.
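One of the interval methods compared above, the percentile bootstrap, in a generic sketch that resamples synthetic lifetimes rather than the power Lindley model; the data-generating choices are invented:

```python
# Percentile-bootstrap 95% confidence interval for a mean time to
# failure, from synthetic Weibull lifetimes.
import numpy as np

rng = np.random.default_rng(5)
failures = rng.weibull(1.5, 60) * 100.0        # synthetic lifetimes

boots = np.array([
    rng.choice(failures, failures.size, replace=True).mean()
    for _ in range(2000)
])
lo, hi = np.percentile(boots, [2.5, 97.5])     # percentile interval
print(lo, failures.mean(), hi)
```

The standard-boot and t-boot variants differ only in how the resampled means are converted into interval endpoints.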

 

Keywords: Point estimation, Interval estimation of reliability characteristics, MCMC method.

 

 

Cite: Abhimanyu S. Yadav, P. K. Vishwakarma, H. S. Bakouch, Upendra Kumar, S. Chauhan. Confidence intervals for the reliability characteristics via different estimation methods for the power Lindley model. Reliability: Theory & Applications. 2022, December 4(71): 392-412. https://doi.org/10.24412/1932-2321-2022-471-392-412

 


392-412

 

Inventory Model with Truncated Weibull Decay Under Permissible Delay in Payments and Inflation Having Selling Price Dependent Demand

 

K Srinivasa Rao, M Amulya, K Nirupama Devi

Department of Statistics, Andhra

University, Visakhapatnam, India
ksraoau@yahoo.co.in,

amulya.mothrapu@gmail.com,

knirupamadevi@gmail.com

 

For optimal utilization of resources, inventory models are required in several places, such as market yards, production processes, warehouses, oil exploration industries, and food and vegetable markets. A large body of work has been produced by researchers on inventory models for obtaining optimal ordering quantities and pricing policies. This paper addresses an EOQ model for deteriorating items having Weibull decay under inflation and permissible delay in payments. It is considered that the demand for items is a function of the selling price. It is further assumed that the decay of items starts after a certain period of time, which can be well characterized by a truncated Weibull probability model for the lifetime of the commodity. The optimal ordering and pricing policies of this system are derived and analyzed in the light of the input parameters and costs. Through sensitivity analysis it is demonstrated that the delay in payments and the rate of inflation have a significant effect on the optimal policies. This model is very useful in analyzing market yards where sea foods, vegetables, fruits, and edible oils are stored and distributed.
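The classical EOQ building block underlying such models, shown without the paper's extensions (Weibull decay, inflation, and delay in payments are not reproduced here); the parameter values are invented:

```python
# Classical EOQ: with annual demand D, ordering cost K and holding
# cost h, total cost D/q*K + h*q/2 is minimized at q* = sqrt(2DK/h).
# The closed form is checked against a numerical grid search.
import numpy as np

D, K, h = 1200.0, 50.0, 3.0                  # illustrative parameters
cost = lambda q: D / q * K + h * q / 2.0     # ordering + holding cost

q_closed = np.sqrt(2 * D * K / h)            # = 200 for these values
q_grid = np.linspace(1, 1000, 200_000)
q_num = q_grid[np.argmin(cost(q_grid))]
print(q_closed, q_num)
```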

 

Keywords: EOQ model, selling price dependent demand, truncated decay

 

 

Cite: K Srinivasa Rao, M Amulya, K Nirupama Devi. Inventory Model with Truncated Weibull Decay Under Permissible Delay in Payments and Inflation Having Selling Price Dependent Demand. Reliability: Theory & Applications. 2022, December 4(71): 413-428. https://doi.org/10.24412/1932-2321-2022-471-413-428

 


413-428

 

Comparison of Queuing Performance Using Fuzzy Queuing Model and Intuitionistic Fuzzy Queuing Model with Infinite Capacity FM/FD/1

 

S. Aarthi, M. Shanmugasundari 

Department of Mathematics &

Statistics, College of Science and Humanities,

SRM Institute of Science and Technology,

Kattankulathur, Tamil Nadu, India.
as6238@srmist.edu.in,

shanmugm@srmist.edu.in

 

Under assorted fuzzy numbers, we describe an FM/FD/1 queuing model with unrestricted capacity. The foremost target of this paper is to compare the efficacy of an FM/FD/1 queuing model based on fuzzy queuing theory and on intuitionistic fuzzy queuing theory. Birth (arrival) and death (service) rates are taken to be triangular and triangular intuitionistic fuzzy numbers. The fuzzy consequence of unpredictability modeling is a fuzzy random variable, because arbitrary events can only be recognized in an undefined manner; it is therefore essential to interpret the direct correlation between volatility and vagueness. The model's performance measures are fuzzified and then examined using arithmetic and logical operations. The evaluation metrics for the fuzzy queuing theory model are furnished as a range of outcomes, while the intuitionistic fuzzy queuing theory model has plenty of virtues. An approach is conducted to ascertain quality measures using a methodology in which fuzzy values are preserved without being converted into crisp values, allowing us to draw scientific conclusions in an uncertain environment. Arithmetic operations are defined for the various fuzzy numbers to test the model's technical feasibility, and a comparison illustration is constituted for each fuzzy number.
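An alpha-cut sketch of the fuzzy-queue idea: the crisp M/D/1 queue length Lq = rho^2 / (2(1 - rho)) is increasing in the arrival rate, so interval bounds at each alpha level come from the cut's endpoints. The triangular arrival rate and the service rate below are invented, not the paper's:

```python
# Alpha-cuts of a triangular fuzzy arrival rate propagated through
# the crisp M/D/1 mean queue length Lq = rho**2 / (2*(1 - rho)).
def lq_md1(lam, mu):
    rho = lam / mu
    return rho ** 2 / (2 * (1 - rho))

l, m, u = 1.5, 2.0, 2.5                  # triangular fuzzy arrival rate
mu = 4.0                                  # deterministic service rate
for alpha in (0.0, 0.5, 1.0):
    lo = l + alpha * (m - l)              # alpha-cut lower endpoint
    hi = u - alpha * (u - m)              # alpha-cut upper endpoint
    print(alpha, lq_md1(lo, mu), lq_md1(hi, mu))
```

At alpha = 1 the cut collapses to the modal value and the crisp measure is recovered; lower alpha levels give wider intervals, which is the "range of outcomes" the abstract refers to.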

 

Keywords: queuing theory, triangular fuzzy number, triangular intuitionistic fuzzy number, infinite capacity.

 

 

Cite: S. Aarthi, M. Shanmugasundari. Comparison of Queuing Performance Using Fuzzy Queuing Model and Intuitionistic Fuzzy Queuing Model with Infinite Capacity FM/FD/1. Reliability: Theory & Applications. 2022, December 4(71): 429-442. https://doi.org/10.24412/1932-2321-2022-471-429-442

 


429-442

 

ANALYSIS OF A TWO-STATE PARALLEL SERVERS RETRIAL QUEUEING MODEL WITH BATCH DEPARTURES

 

Neelam Singla

Associate Professor, Department

of Statistics Punjabi University Patiala,

Punjab (147002)
neelgagan2k3@yahoo.co.in
 

Sonia Kalra 

Assistant Professor, University

School of Business Chandigarh

University, Gharuan, Mohali

(Punjab) 140413
soniakalra276@gmail.com

 

This paper deals with the transient behavior of a Markovian retrial queueing model containing two parallel servers, in which departures occur in batches. At the arrival epoch, if all servers are busy, the customer joins the retrial group; if any server is free, the customer joins that server and starts service immediately. We assume that primary customers arrive according to a Poisson process, retrial customers follow the same fashion, and service times follow an exponential distribution. Explicit time-dependent probabilities of the exact number of arrivals and the exact number of departures when both servers are free, when one server is busy, and when both servers are busy are obtained by solving the difference-differential equations recursively. Some important verifications and the conversion of the two-state model into a single-state model are also discussed, and some existing results are deduced as special cases.

 

Keywords: Retrial, Queueing, Arrivals, Departures, Batch

 

 

Cite: Neelam Singla, Sonia Kalra. ANALYSIS OF A TWO-STATE PARALLEL SERVERS RETRIAL QUEUEING MODEL WITH BATCH DEPARTURES. Reliability: Theory & Applications. 2022, December 4(71): 443-452. https://doi.org/10.24412/1932-2321-2022-471-443-452

 


443-452

 

The Transmuted Weibull Frechet Distribution: Properties and Applications

 

Joseph Thomas Eghwerido

Department of Statistics, Federal

University of Petroleum Resources,

Effurun, Delta State, Nigeria
eghwerido.joseph@fupre.edu.ng

 

The behaviour of everyday real-life processes plays a great role in distribution theory. This article therefore proposes a transmuted Weibull Frechet (TWFr) distribution for modeling real-life datasets. Most importantly, statistical properties of the TWFr distribution such as the hazard and survival functions, order statistics, quantile, odd, and cumulative functions are derived and examined. A simulation study examining the performance of the TWFr distribution is also conducted. Two real-life applications, glass fiber data and breaking stress of carbon data, are used to showcase the performance of the proposed model. The results show that the TWFr distribution competes favourably with other continuous distributions in the Frechet family.
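The quadratic rank transmutation map that underlies "transmuted" families in general is G(x) = (1 + t) F(x) - t F(x)^2 with |t| <= 1; whether the paper composes it with Weibull Frechet in exactly this order is not reproduced here. A sketch with a standard Frechet baseline, numerically checking that G is a valid, non-decreasing cdf:

```python
# Quadratic rank transmutation of a standard Frechet baseline
# F(x) = exp(-x**(-a)); parameters a and t are invented.
import numpy as np

a, t = 2.0, 0.5
F = lambda x: np.exp(-x ** (-a))             # Frechet baseline cdf
G = lambda x: (1 + t) * F(x) - t * F(x) ** 2  # transmuted cdf

x = np.linspace(0.05, 50, 10_000)
g = G(x)
print(g[0], g[-1], bool(np.all(np.diff(g) >= 0)))  # ~0, ~1, monotone
```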

 

Keywords: Frechét distribution, Hazard rate function, Order statistics, Transmutation, Weibull distribution.

 

 

Cite: Joseph Thomas Eghwerido. The Transmuted Weibull Frechet Distribution: Properties and Applications. Reliability: Theory & Applications. 2022, December 4(71): 453-468. https://doi.org/10.24412/1932-2321-2022-471-453-468

 


453-468

 

AN IMPROVED DIFFERENCE CUM – EXPONENTIAL RATIO TYPE ESTIMATOR IN RANKED SET SAMPLING

 

Khalid Ul Islam Rather 

Division of Statistics and Computer

Science, SKUAST-Jammu, India.
khalidstat34@gmail.com

 

Asad Ali

Department of Economics and

Statistics, University of Management

and Technology, Lahore, Pakistan
ranaasadali23@gmail.com

 

M. Iqbal Jeelani 

Division of Statistics and Computer

Science, SKUAST-Jammu, India.

jeelani.miqbal@gmail.com

 

Ranked set sampling is an approach to data collection that combines simple random sampling with the field investigator's professional knowledge and judgment to select sampling locations; where appropriate, field screening measurements can replace professional judgment. The approach continues to stimulate substantial methodological research. Ranked set sampling increases the chance that the collected samples yield representative measurements, which results in better estimates of the mean as well as improved performance of many statistical procedures. Moreover, ranked set sampling can be more cost-efficient than simple random sampling because fewer samples need to be collected and measured. The use of professional judgment in selecting sampling locations is a powerful incentive to use ranked set sampling. In this paper, we propose an improved difference cum exponential ratio type estimator of the population mean of a study variate Y under ranked set sampling, using information on a single auxiliary variable X; the basic ratio and generalized exponential ratio estimators arise as special cases. The expression for the mean squared error of the proposed estimator under ranked set sampling is derived, and theoretical comparisons are made with competing estimators. We show that the proposed estimator has a lower mean squared error than the existing estimators, and these theoretical results are supported with the aid of some real data sets using RStudio.
The estimator's mathematical form is developed, and its efficiency conditions are established relative to various existing estimators from the literature. By assigning different values to the constants used in the construction of the proposed estimator, we also obtain several special cases of our estimator.
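The balanced ranked set sampling scheme the abstract builds on can be sketched in a few lines of Python. This is an illustrative sketch with perfect ranking and the plain RSS mean, not the proposed difference cum exponential ratio estimator:

```python
import random

def rss_sample(population, m, cycles, rng):
    """Balanced ranked set sample of set size m over the given cycles.
    For each rank i, a fresh set of m units is drawn and its i-th
    smallest value is kept.  Perfect ranking is assumed here; in the
    field, ranking would use judgment or an auxiliary variable."""
    sample = []
    for _ in range(cycles):
        for i in range(m):
            judgment_set = sorted(rng.sample(population, m))
            sample.append(judgment_set[i])
    return sample

rng = random.Random(42)
population = [rng.gauss(50, 10) for _ in range(10_000)]
rss = rss_sample(population, m=4, cycles=50, rng=rng)
rss_mean = sum(rss) / len(rss)  # unbiased for the population mean under RSS
```

Because each rank appears equally often, the sample mean stays unbiased while its variance is smaller than that of a simple random sample of the same size.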

 

Keywords: Ranked Set Sampling; Exponential Ratio Type Estimator; Ratio Estimator; Mean Square Error (MSE); Efficiency; RStudio

 

 

Cite: Khalid Ul Islam Rather, Asad Ali, M. Iqbal Jeelani. An Improved Difference Cum Exponential Ratio Type Estimator in Ranked Set Sampling. Reliability: Theory & Applications. 2022, December 4(71): 469-476. https://doi.org/10.24412/1932-2321-2022-471-469-476

 


469-476

 

Bayesian Analysis of Type II Generalized Topp–Leone Accelerated Failure Time Models Using R and Stan

 

Devashish, Athar Ali Khan

Department of Statistics and Operations

Research Aligarh Muslim University,

Aligarh-202002, India
devashishstatistics@gmail.com,

atharkhan1962@gmail.com

 

Within a Bayesian framework, the current study fits the Type II generalized Topp–Leone-G (TIIGTL-G) model as an accelerated failure time (AFT) model to censored survival data. We obtain and analyse three AFT models using the Type II Generalized Topp–Leone (TIIGTL) distribution as a generator, with the Weibull, Exponential, and Log-Logistic distributions as baselines. These models are fitted to the censored survival data with the help of R and Stan. The models are compared, and the best one is chosen using the Bayesian model evaluation criteria LOOIC and WAIC.
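The paper fits its models in R and Stan; as a language-neutral illustration of the censored AFT likelihood it describes, here is a minimal Python sketch with a Weibull baseline. This is a generic AFT likelihood under assumed notation, not the TIIGTL-G generator from the paper:

```python
import math

def weibull_aft_loglik(times, events, x, beta0, beta1, shape):
    """Right-censored log-likelihood for a Weibull AFT model in which
    the Weibull scale is exp(beta0 + beta1 * x).  Uncensored
    observations (event = 1) contribute log f(t); right-censored ones
    (event = 0) contribute log S(t)."""
    ll = 0.0
    for t, d, xi in zip(times, events, x):
        scale = math.exp(beta0 + beta1 * xi)
        z = (t / scale) ** shape
        log_s = -z                                  # log survival
        log_f = math.log(shape / scale) + (shape - 1) * math.log(t / scale) - z
        ll += log_f if d == 1 else log_s
    return ll

# shape = 1 and beta = 0 reduce this to an Exp(1) model, so the two
# observations below contribute -1 (uncensored at t=1) and -2
# (censored at t=2), i.e. a total of -3.
print(weibull_aft_loglik([1.0, 2.0], [1, 0], [0.0, 0.0], 0.0, 0.0, 1.0))
```

In Stan the same structure appears as a `target +=` statement per observation, with `lpdf` terms for events and `lccdf` terms for censored times.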

 

Keywords: Type II generalized Topp–Leone G Model, Bayesian Survival Modelling, Censored data, Leave one out information criteria, STAN

 

 

Cite: Devashish, Athar Ali Khan. Bayesian Analysis of Type II Generalized Topp–Leone Accelerated Failure Time Models Using R and Stan. Reliability: Theory & Applications. 2022, December 4(71): 477-493. https://doi.org/10.24412/1932-2321-2022-471-477-493

 


477-493

 

Reliability and Performance Analysis of a Complex Manufacturing System with Inspection facility using Copula Methodology

 

Surabhi Sengar

G.B. Pant University of Agriculture

& Technology, Pantnagar, Uttarakhand. India
sursengar@gmail.com

 

Mangey Ram 

Graphic Era Deemed to be University, Dehradun,

Uttarakhand, India
drmswami@yahoo.com

 

This paper deals with the assessment of various reliability measures of a real-life manufacturing system with an inspection facility. The multistate manufacturing system has five workstations connected in a series configuration: W1, W2, W3, W4, W5. Workstations W2 and W4 have 2-out-of-3: G and 1-out-of-3: F configurations, respectively. Failure of any workstation can bring the whole manufacturing system down; machine failure can also make the system unavailable. To avoid sudden failures, a pre-emptive maintenance strategy has been adopted: corrective maintenance actions are taken before a failure occurs and are scheduled during off days. Risk analysis is carried out for faults of workstation W5 in material-quality inspection. All failures follow the exponential time distribution, and all repairs follow a general time distribution. A Markov process is used to study the probabilistic behavior of the system in its possible transition states. The supplementary variable technique and the copula method of constructing a joint probability distribution are used to obtain various reliability characteristics, such as the steady-state behavior of the system, the reliability function, availability, mean time to failure, sensitivity analysis, and profit analysis.
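For context on the 2-out-of-3: G configuration of workstation W2: the static reliability of a k-out-of-n: G block with independent identical components is a binomial tail sum. This small sketch ignores the repair dynamics handled by the paper's Markov and copula analysis:

```python
from math import comb

def k_out_of_n_reliability(k, n, r):
    """Probability that at least k of n independent identical
    components, each up with probability r, are working
    (a k-out-of-n:G block)."""
    return sum(comb(n, j) * r**j * (1 - r)**(n - j) for j in range(k, n + 1))

# Workstation W2 as a 2-out-of-3:G block with component reliability 0.9:
# 3 * 0.9**2 * 0.1 + 0.9**3 = 0.972
print(round(k_out_of_n_reliability(2, 3, 0.9), 6))  # 0.972
```

A 1-out-of-3: F block (workstation W4) fails as soon as one component fails, so its reliability is simply r**3 in the same notation.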

 

Keywords: Reliability analysis, Mean time to failure, Availability, Sensitivity analysis, Risk analysis

 

 

Cite: Surabhi Sengar, Mangey Ram. Reliability and Performance Analysis of a Complex Manufacturing System with Inspection Facility Using Copula Methodology. Reliability: Theory & Applications. 2022, December 4(71): 494-508. https://doi.org/10.24412/1932-2321-2022-471-494-508

 


494-508

 

On the Minimum of Exponential and Teissier Distributions

 

Vishwa Prakash Jha

Department of Mathematics,

National Institute of Technology,

Tiruchirappalli 620015, India
vpjha.nitt@gmail.com

 

V. Kumaran

Department of Mathematics,

National Institute of Technology,

Tiruchirappalli 620015, India

Corresponding author: kumaran@nitt.edu

 

 

In reliability theory, the minimum of two random variables has a significant meaning, and models with increasing failure rates play a vital role. Motivated by these facts, in this article a two-parameter lifetime distribution with an increasing failure rate is constructed as the minimum of two independent random variables following the exponential and Teissier distributions, and it is studied in detail. Several interesting features, such as moments, quantiles, Bonferroni and Lorenz curves, entropies, stress–strength reliability, moments of the residual lifetime, and order statistics, are derived for the proposed distribution. For estimation, eight different techniques are used: maximum likelihood, ordinary least squares, weighted least squares, Cramer-von Mises, maximum product spacing, Anderson-Darling, right-tailed Anderson-Darling, and bootstrapping (parametric and nonparametric). The performance of these estimators is compared using three real datasets. The exact Fisher information matrix elements are derived, and confidence intervals based on the information matrix and on bootstrapping are constructed. A simulation study examines the efficiency of maximum likelihood in terms of mean square error and bias. Negative log-likelihood and the Akaike, Bayesian, consistent Akaike, and Hannan-Quinn information criteria are the goodness-of-fit statistics employed; nonparametric test statistics such as Kolmogorov-Smirnov, Anderson-Darling, and Cramer-von Mises are also used for model selection.
Moreover, three real datasets related to epidemiology, seismology, and reliability are modeled and compared with the exponential, exponentiated exponential, Lindley, exponentiated Lindley, Rayleigh, exponentiated Rayleigh, Gompertz, exponentiated Gompertz, Weibull, and exponentiated Weibull distributions to demonstrate how the suggested model performs in practice. According to most of the test statistics, the proposed distribution provides a better fit than all the considered models. The proposed lifetime distribution is unimodal, capable of modeling positive datasets with an increasing failure rate, and contains the one-parameter Gompertz model as a particular case. It is a simple model with only two parameters, and the expressions for its various characteristics are analytically tractable. It is therefore expected to be helpful in various disciplines where such data arise, such as reliability, lifetime modeling, and survival analysis.
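The minimum construction can be made explicit. Assuming the standard one-parameter Teissier survival function $S_T(t)=\exp\{1+\theta t-e^{\theta t}\}$ (with hazard $\theta(e^{\theta t}-1)$) and an independent exponential variable with rate $\lambda$, the survival and hazard functions of the minimum multiply and add, respectively:

```latex
S_{\min}(t) = e^{-\lambda t}\,\exp\{1+\theta t-e^{\theta t}\}
            = \exp\{1+(\theta-\lambda)\,t-e^{\theta t}\}, \qquad
h_{\min}(t) = \lambda + \theta\,(e^{\theta t}-1), \quad t>0 .
```

The hazard $h_{\min}$ is increasing from $\lambda$, and for $\lambda=\theta$ it reduces to $h(t)=\theta e^{\theta t}$, the Gompertz hazard, which is consistent with the abstract's remark that the one-parameter Gompertz model is a particular case.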

 

Keywords: Probability distribution, Moments, Information Matrix, Maximum likelihood estimator, Bootstrap, Simulations

 

 

Cite: Vishwa Prakash Jha, V. Kumaran. On the Minimum of Exponential and Teissier Distributions. Reliability: Theory & Applications. 2022, December 4(71): 509-520. https://doi.org/10.24412/1932-2321-2022-471-509-520

 


509-520

 

ON CONSISTENCY OF BAYESIAN PARAMETER ESTIMATORS FOR A CLASS OF ERGODIC MARKOV MODELS

 

A.I. Nurieva

National Research University Higher

School of Economics, Russia
ai_nurieva@mail.ru 

 

A.Yu. Veretennikov

Institute for Information

Transmission Problems, Russia

ayv@iitp.ru

 

The consistency of the Bayesian estimator of a parameter is shown for a class of ergodic discrete Markov chains. J.L. Doob's method, offered earlier for the i.i.d. situation, is used. The result may be useful in reliability theory for models with unknown parameters, in risk management in financial mathematics, and in other applications.
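The consistency phenomenon is easy to see in the simplest ergodic case, a two-state chain: moves out of each state are conditionally Bernoulli, so a conjugate Beta posterior on a transition probability concentrates on the true value as the trajectory grows. This is an illustrative sketch, not the paper's general argument:

```python
import random

def simulate_chain(p01, p10, n, rng):
    """Simulate n steps of a two-state ergodic Markov chain with
    transition probabilities P(0->1) = p01 and P(1->0) = p10."""
    state, path = 0, [0]
    for _ in range(n):
        if state == 0:
            state = 1 if rng.random() < p01 else 0
        else:
            state = 0 if rng.random() < p10 else 1
        path.append(state)
    return path

rng = random.Random(7)
path = simulate_chain(p01=0.3, p10=0.6, n=20_000, rng=rng)

# Bayesian estimation of p01 under a uniform Beta(1, 1) prior: moves out
# of state 0 are conditionally Bernoulli(p01), so given k moves 0 -> 1
# among m visits to state 0 the posterior is Beta(1 + k, 1 + m - k).
m = sum(1 for s in path[:-1] if s == 0)
k = sum(1 for a, b in zip(path, path[1:]) if a == 0 and b == 1)
posterior_mean = (1 + k) / (2 + m)  # concentrates at p01 as n grows
```

Ergodicity guarantees that the chain visits state 0 infinitely often, which is what drives the posterior concentration here.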

 

Keywords: Bayesian estimator; consistency; ergodic Markov chain

 

 

Cite: A.I. Nurieva, A.Yu. Veretennikov. ON CONSISTENCY OF BAYESIAN PARAMETER ESTIMATORS FOR A CLASS OF ERGODIC MARKOV MODELS. Reliability: Theory & Applications. 2022, December 4(71): 521-529. https://doi.org/10.24412/1932-2321-2022-471-521-529

 


521-529

 

On the Degree of Mutual Dependence of Three Events

 

Valentin Vankov Iliev

Institute of Mathematics and Informatics
Bulgarian Academy of Sciences, Sofia, Bulgaria
viliev@math.bas.bg

 

We define the degree of mutual dependence of three events in a probability space by using the Boltzmann-Shannon entropy function of an appropriate distribution produced by these events and depending on four parameters varying, in general, within a polytope. It turns out that the entropy function attains its absolute maximum exactly when the three events are mutually independent, and its absolute minimum at some vertices of the polytope where the events are "maximally" dependent. By composing the entropy function with an appropriate linear function, we obtain a continuous "degree of mutual dependence" function with the same domain and the interval [0, 1] as a target. It attains the value 0 when the events are mutually independent (the entropy is maximal) and the value 1 when they are "maximally" dependent (the entropy is minimal). A link is available for downloading a Java code which evaluates the degree of mutual dependence of three events in the classical case of a sample space with equally likely outcomes.
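In the classical equally-likely setting mentioned at the end of the abstract, the idea can be sketched in Python: compute the entropy of the eight-atom distribution generated by A, B, C and normalise against the entropy of the independent distribution with the same marginals. The normalisation below is illustrative only and does not reproduce the paper's exact linear rescaling, whose minimum is taken over polytope vertices:

```python
from math import log
from itertools import product

def dependence_degree(omega, A, B, C):
    """Rough degree-of-dependence sketch for three events A, B, C in a
    finite sample space omega with equally likely outcomes: compare the
    entropy H of the eight-atom joint distribution with the entropy
    H_ind of the independent distribution sharing the same marginals."""
    n = len(omega)
    pA, pB, pC = len(A) / n, len(B) / n, len(C) / n
    H = H_ind = 0.0
    for a, b, c in product([0, 1], repeat=3):
        atom = [w for w in omega
                if (w in A) == a and (w in B) == b and (w in C) == c]
        p = len(atom) / n
        q = (pA if a else 1 - pA) * (pB if b else 1 - pB) * (pC if c else 1 - pC)
        if p > 0:
            H -= p * log(p)
        if q > 0:
            H_ind -= q * log(q)
    return 0.0 if H_ind == 0 else 1 - H / H_ind

# Three mutually independent "bit" events on {0,...,7}: degree is 0.
omega = set(range(8))
A = {w for w in omega if w & 1}
B = {w for w in omega if w & 2}
C = {w for w in omega if w & 4}
print(dependence_degree(omega, A, B, C))  # 0.0 (mutually independent)
```

When A = B = C the atoms collapse onto two cells, entropy drops well below the independent maximum, and the degree moves toward 1.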

 

Keywords: entropy; average information; degree of dependence; probability space; probability distribution; experiment in a sample space; linear system; affine isomorphism; classification space

 

 

Cite: Iliev, V. On the Degree of Mutual Dependence of Three Events. Reliability: Theory & Applications. 2022, December 4(71): 530-542. https://doi.org/10.24412/1932-2321-2022-471-530-542


530-542

 

Power Length Biased Weighted Lomax Distribution

 

Shamshad Ur Rasool

Department of Statistics, University

of Kashmir, Srinagar, India
srasool92@gmail.com

 

S.P. Ahmad 
Department of Statistics, University

of Kashmir, Srinagar, India
sprvz@yahoo.com

 

In this research paper, we propose the Power Length Biased Weighted Lomax Distribution (PLBWLD) as a new probability model. Moments, the moment generating function, the characteristic function, the cumulant generating function, and reliability characteristics such as the survival function, hazard rate, reverse hazard rate, cumulative hazard function, and Mills ratio are among the statistical features of PLBWLD obtained here. Order statistics and the generalized entropy of PLBWLD are also calculated. Maximum likelihood estimation is used to estimate the parameters of the model. Finally, for demonstration purposes, an application to real data sets is provided to illustrate the new probability model's performance and flexibility.
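The two standard constructions combined in PLBWLD can be written generically. For a baseline density $f$, a weight function $w$ gives the weighted density; length-biasing is the special case $w(x)=x$; and the power transform $Y=X^{1/\alpha}$ changes variables in the usual way. These are the generic forms only; the paper's specific Lomax-based density is not reproduced here:

```latex
f_w(x)=\frac{w(x)\,f(x)}{E[w(X)]},\qquad
f_{LB}(x)=\frac{x\,f(x)}{E[X]},\qquad
f_Y(y)=\alpha\,y^{\alpha-1}\,f_X\!\left(y^{\alpha}\right),\quad y>0 .
```

Applying the power transform to a length-biased weighted Lomax baseline yields the PLBWLD family studied in the paper.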

 

Keywords: Length biased weighted Lomax distribution, power length biased weighted Lomax distribution, hazard rate function, moments, maximum likelihood estimation, order statistics, generalized entropy.

 

 

Cite: Shamshad Ur Rasool, S.P. Ahmad. Power Length Biased Weighted Lomax Distribution. Reliability: Theory & Applications. 2022, December 4(71): 543-558. https://doi.org/10.24412/1932-2321-2022-471-543-558

 


543-558

 

Inferences for Two Parameter Teissier Distribution in Case of Fuzzy Progressively Censored Data

 

Sudhanshu Vikram Singh

Department of Mathematics, Institute of Infrastructure
Technology Research and Management, Ahmedabad, India
sudhanshu.vikram22061991@gmail.com

 

Vikas Kumar Sharma

Department of Statistics, Institute of Science,

Banaras Hindu University, Varanasi, India

vikasstats@rediffmail.com

 

Sanjay Kumar Singh 

Department of Statistics, Institute of Science,

Banaras Hindu University, Varanasi, India

singhsk64@gmail.com

 

In the process of observing data, it is sometimes not possible to obtain data precisely, and fuzzy methods are useful for analyzing such data sets. In this article, we propose the location-scale family of the Teissier distribution for fitting fuzzy censored data sets. The maximum likelihood, least squares and Bayes estimators of the parameters of the Teissier distribution are constructed in the presence of progressively fuzzy censored samples. In addition, statistical properties of the distribution are derived. The tensile strengths of carbon fibers are fitted using the proposed distribution, with comparison to the location-scale families of the exponential, Maxwell and Lindley distributions. We find that the Teissier distribution can be effectively used for fitting both complete and fuzzy censored data.

 

Keywords: Location-scale Teissier distribution, Fuzzy lifetime data, Type-II progressive censoring scheme, Mean residual life, Moments, Maximum likelihood estimator, Least squares estimator, Bayes estimator

 

 

Cite: Sudhanshu Vikram Singh, Vikas Kumar Sharma, Sanjay Kumar Singh. Inferences for Two Parameter Teissier Distribution in Case of Fuzzy Progressively Censored Data. Reliability: Theory & Applications. 2022, December 4(71): 559-573. https://doi.org/10.24412/1932-2321-2022-471-559-573

 


559-573

 

Record-based Transmuted Power Lomax Distribution: Properties and its Applications in Reliability

 

K.M. Sakthivel, V. Nandhini 

Department of Statistics, Bharathiar University,

Coimbatore-641046, Tamilnadu, India
sakthithebest@buc.edu.in, nandhinivtvt@gmail.com

 

In this paper, we consider a record-based transmuted version of the Power Lomax distribution, named the Record-based Transmuted Power Lomax (RTPL) distribution. We present several statistical properties of the proposed distribution, such as moments, quantiles, stochastic ordering, and order statistics, with explicit expressions. Some of its reliability measures, such as the survival function, hazard function, cumulative hazard function, mean residual time, and mean inactivity time, are also discussed. The maximum likelihood method is used to estimate the parameters of the RTPL distribution, and the new extended model is applied to real data sets to assess its suitability and applicability based on well-known information criteria and goodness-of-fit tests. A simulation study is performed to verify the efficiency and asymptotic behavior of the maximum likelihood estimators.

 

Keywords: Record-based Transmuted map, Power Lomax distribution, Lambert W function, Maximum Likelihood Estimation.

 

 

Cite: K.M. Sakthivel, V. Nandhini. Record-based Transmuted Power Lomax Distribution: Properties and its Applications in Reliability. Reliability: Theory & Applications. 2022, December 4(71): 574-592. https://doi.org/10.24412/1932-2321-2022-471-574-592

 


574-592

 

Censoring and Reliability Inferences for Power Lindley Distribution with Application on Hematologic Malignancies Data

 

Abbas Pak,
Department of Computer Sciences,

Shahrekord University, Shahrekord, Iran
abbas.pak1982@gmail.com

 

Mohamed E. Ghitany

Department of Statistics and Operations

Research, Faculty of Science, Kuwait University, Kuwait
ghitany@kuc01.kuniv.edu.kw

 

In this paper, using progressively type II censored samples, we discuss estimation of the parameters of a power Lindley model. Maximum likelihood estimates (MLEs) and approximate confidence intervals of the unknown parameters are obtained. Then, under the squared error loss function, the Bayes estimates of the parameters are derived. Because the Bayes estimates have no closed forms, we use Tierney and Kadane's technique to calculate approximate Bayes estimates. Further, the results are extended to the stress-strength reliability parameter involving two power Lindley distributions. The ML estimate of the stress-strength parameter and its approximate confidence interval are obtained. The Bayes estimates and the highest posterior density credible interval of this parameter are obtained using a Markov chain Monte Carlo method. To evaluate the performance of the maximum likelihood and Bayes estimators, simulation studies are conducted, and two examples of real data sets are provided to illustrate the procedures.
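The stress-strength parameter R = P(X > Y) for two power Lindley variables can be checked by simulation, using the well-known mixture representation of the Lindley distribution and the fact that a power Lindley variable is a Lindley variable raised to the power 1/alpha. This is a Monte Carlo sanity check, not the paper's ML or Bayes machinery:

```python
import random

def rlindley(theta, rng):
    """Draw from Lindley(theta) via its mixture representation:
    Exp(theta) with probability theta/(theta+1), otherwise
    Gamma(2, theta), i.e. the sum of two Exp(theta) draws."""
    if rng.random() < theta / (theta + 1):
        return rng.expovariate(theta)
    return rng.expovariate(theta) + rng.expovariate(theta)

def rpower_lindley(alpha, theta, rng):
    """X is power Lindley(alpha, theta) iff X**alpha is Lindley(theta)."""
    return rlindley(theta, rng) ** (1.0 / alpha)

def stress_strength(alpha1, theta1, alpha2, theta2, n, rng):
    """Monte Carlo estimate of R = P(X > Y) for independent power
    Lindley strength X and stress Y."""
    hits = sum(rpower_lindley(alpha1, theta1, rng) >
               rpower_lindley(alpha2, theta2, rng) for _ in range(n))
    return hits / n

rng = random.Random(1)
# Identical stress and strength laws, so R should be close to 0.5.
R = stress_strength(2.0, 1.0, 2.0, 1.0, n=100_000, rng=rng)
```

Such a simulation is a quick way to cross-check the analytical ML and Bayes estimates of R reported by procedures like those in the paper.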

 

Keywords: Power Lindley model, progressive type II censoring, Bayesian approach, Maximum likelihood method, Stress-strength reliability

 

 

Cite: Abbas Pak, Mohamed E. Ghitany. Censoring and Reliability Inferences for Power Lindley Distribution with Application on Hematologic Malignancies Data. Reliability: Theory & Applications. 2022, December 4(71): 593-610. https://doi.org/10.24412/1932-2321-2022-471-593-610

 


593-610