The predictive receiver operating characteristic (PROC) curve is a diagrammatic format with application in the statistical evaluation of probabilistic disease forecasts. The PROC curve differs from the better-known receiver operating characteristic (ROC) curve in that it provides a basis for evaluation using metrics defined conditionally on the outcome of the forecast, rather than metrics defined conditionally on the actual disease status. Starting from the binormal ROC curve formulation, an overview of some previously published binormal PROC curves is presented in order to place the PROC curve in the context of other methods for the statistical evaluation of probabilistic disease forecasts based on the analysis of predictive values, in particular the index of separation (PSEP) and the leaf plot. An information-theoretic perspective on evaluation is also outlined. Five straightforward recommendations are made with a view to aiding understanding and interpretation of the sometimes-complex patterns generated by PROC curve analysis. The PROC curve and related analyses augment the perspective provided by traditional ROC curve analysis. Here, the binormal ROC model provides the exemplar for investigation of the PROC curve, but potential application extends to analysis based on other distributional models as well as to empirical analysis; a numerical sketch of the binormal construction is given after the next paragraph.

In this paper, we first study a new two-parameter lifetime distribution. This distribution accommodates monotone and non-monotone hazard rate functions, which are useful in lifetime data analysis and reliability. Some of its mathematical properties are derived, including explicit expressions for the ordinary and incomplete moments, the generating function, the Rényi entropy, the δ-entropy, order statistics, and probability weighted moments. Non-Bayesian estimation methods, namely maximum likelihood, Cramér-von Mises, percentile estimation, and L-moments, are used for estimating the model parameters (an illustrative maximum likelihood fit is sketched below). The importance and flexibility of the new distribution are illustrated by means of two applications to real data sets. Using the Bagdonavičius-Nikulin approach to goodness-of-fit testing under right censoring, we then propose and apply a modified chi-square goodness-of-fit test for the Burr X Weibull model. The modified goodness-of-fit statistic is applied to a right-censored real data set. Based on the maximum likelihood estimators computed from the censored initial data, the modified test recovers the information lost through grouping, and the grouped-data statistic follows the chi-square distribution. The elements of the modified test criteria are derived. A real-data application is also given for validation under the uncensored scheme.
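To make the binormal construction of the opening paragraph concrete, the following is a minimal sketch, not code from the paper itself: it assumes scores distributed N(0, 1) in non-diseased subjects and N(mu1, sigma1^2) in diseased subjects, a hypothetical prevalence, and one common convention for the PROC axes (positive predictive value against one minus negative predictive value); all parameter values are invented for illustration.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical binormal parameters (illustration only):
# scores ~ N(0, 1) in non-diseased, N(mu1, sigma1^2) in diseased.
mu1, sigma1 = 1.5, 1.2
prev = 0.1                                 # assumed disease prevalence

t = np.linspace(-4.0, 6.0, 500)            # decision thresholds
tpr = norm.sf(t, loc=mu1, scale=sigma1)    # sensitivity
fpr = norm.sf(t)                           # 1 - specificity

# Predictive values via Bayes' theorem at each threshold.
ppv = prev * tpr / (prev * tpr + (1 - prev) * fpr)
npv = (1 - prev) * (1 - fpr) / ((1 - prev) * (1 - fpr) + prev * (1 - tpr))

# ROC curve: (fpr, tpr); PROC curve under this convention: (1 - npv, ppv).
```

Unlike the ROC curve, the PROC curve depends on the assumed prevalence, which is precisely what makes it informative about predictive values.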
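For the lifetime-distribution study above, the new density is not reproduced here, so the sketch below illustrates only the maximum likelihood step, with a standard two-parameter Weibull model standing in for the proposed distribution and toy data in place of the real data sets.

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_lik(params, x):
    """Negative log-likelihood of a two-parameter Weibull (stand-in model)."""
    shape, scale = params
    if shape <= 0 or scale <= 0:
        return np.inf                      # keep the search in the valid region
    z = x / scale
    # log f(x) = log(shape/scale) + (shape - 1) log(x/scale) - (x/scale)^shape
    return -np.sum(np.log(shape / scale) + (shape - 1) * np.log(z) - z ** shape)

x = np.array([0.8, 1.2, 2.0, 2.3, 3.1, 4.7, 5.0, 6.4])   # toy lifetimes
res = minimize(neg_log_lik, x0=[1.0, float(np.mean(x))], args=(x,),
               method="Nelder-Mead")
print("MLE estimates (shape, scale):", res.x)
```

The other estimators mentioned (Cramér-von Mises, percentile, L-moments) replace this objective with their own fitting criteria but can be handled by the same numerical machinery.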
This paper focuses on the problem of lossy compression storage based on data value, which represents users' subjective assessment, in the situation where storage capacity is still insufficient after conventional lossless data compression. To this end, we transform the problem into an optimization that pursues the least importance-weighted reconstruction error within a limited total storage size, where importance is adopted to characterize data value from the users' viewpoint. On this basis, the paper puts forward an optimal allocation strategy for digital data storage under an exponential distortion measure, which makes rational use of all the storage space. The theoretical results show that the solution is a kind of restrictive water-filling, sketched below. It also characterizes the trade-off between the relative weighted reconstruction error and the available storage size. Consequently, if a relatively small part of the total data value is allowed to be lost, this strategy improves the performance of data compression. Furthermore, the paper shows that both users' preferences and particular characteristics of the data distribution can give rise to small-probability-event scenarios in which only a fraction of the data covers the vast majority of users' interest. In either case, data with highly clustered message importance is beneficial for compression storage. In contrast, from the perspective of optimal storage allocation based on data value, data with a uniform importance distribution is incompressible, which is consistent with classical information theory.
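The following is a minimal sketch of such a restrictive water-filling allocation, assuming a distortion of the exponential form w_i * exp(-a * r_i) for item i with importance weight w_i and allocated storage r_i; the paper's exact distortion measure and notation may differ, and the weights and budget below are hypothetical.

```python
import numpy as np

def restrictive_water_filling(w, R, a=1.0, iters=200):
    """Choose r_i >= 0 with sum(r) = R to minimize sum_i w_i * exp(-a * r_i).

    The KKT conditions give r_i = max(0, ln(a * w_i / lam) / a), where the
    'water level' lam is set by bisection so that the budget R is met."""
    w = np.asarray(w, dtype=float)
    lo, hi = 1e-12, a * w.max()        # r = 0 everywhere once lam = a * max(w)
    for _ in range(iters):
        lam = 0.5 * (lo + hi)
        r = np.maximum(0.0, np.log(a * w / lam) / a)
        if r.sum() > R:
            lo = lam                   # allocations too generous: raise the level
        else:
            hi = lam
    return r

w = np.array([0.5, 0.2, 0.15, 0.1, 0.05])   # hypothetical importance weights
r = restrictive_water_filling(w, R=3.0)
print(np.round(r, 3), round(r.sum(), 3))
```

Items whose importance falls below the water level receive no storage at all, which is the 'restrictive' aspect: low-value data is dropped entirely before high-value data is degraded.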
Technological innovations are not enough by themselves to achieve social and environmental sustainability in companies. Sustainable development aims to determine the environmental impact of a product and the hidden price of products and services through the concept of radical transparency: companies should disclose the environmental impact of any good or service, so that consumers can choose transparently and not only on price. An eco-label such as the European Ecolabel, whose criteria are based on life cycle assessment, could provide an indicator of corporate social responsibility for a given product; however, it does not fully guarantee that the product was obtained in a sustainable manner. The aim of this work is to provide a way of calculating the value of the environmental impacts of an industrial product under different operating conditions, so that each company can provide detailed information on the impacts of its products and processes, using a standardised impact methodology.

In theory, high key sensitivity and high plaintext sensitivity are required for a cryptosystem to resist chosen/known-plaintext and differential attacks. High plaintext sensitivity can be achieved by ensuring that each encrypted result is plaintext-dependent. In this work, we present a detailed cryptanalysis of a published chaotic-map-based image encryption system in which the encryption process is plaintext-image dependent. We show that several design flaws make the published cryptosystem vulnerable to a chosen-plaintext attack (a toy illustration of this class of flaw appears at the end of this section), and we then propose an enhanced algorithm that overcomes those flaws.

The identification of systemically important financial institutions (SIFIs) has become a hot topic in the field of financial risk management. Making full use of 5-min high-frequency data, and with the help of the entropy-weight technique for order preference by similarity to ideal solution (TOPSIS), this paper builds a jump volatility spillover network of China's financial institutions to measure systemic importance. We find that (i) state-owned depository institutions and large insurers rank as SIFIs according to their entropy-weight TOPSIS scores; (ii) the total connectedness of the network reveals that Industrial Bank, Ping An Bank, and Pacific Securities play an important role when the financial market is under pressure, especially during the subprime crisis, the European sovereign debt crisis, and China's stock market crash; (iii) interestingly, some small financial institutions are also systemically important during financial crises and cannot be ignored.
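A minimal sketch of the entropy-weight TOPSIS scoring used above: the toy matrix below stands in for the jump-volatility spillover indicators actually built from the 5-min data, all entries are invented, and every criterion is assumed to be "larger is better".

```python
import numpy as np

def entropy_weight_topsis(X):
    """Score alternatives (rows) on positive benefit criteria (columns)."""
    m, _ = X.shape
    # Entropy weights: criteria that discriminate more between rows weigh more.
    P = X / X.sum(axis=0)
    e = -(P * np.log(P)).sum(axis=0) / np.log(m)
    w = (1 - e) / (1 - e).sum()
    # TOPSIS: weighted vector-normalized matrix, then closeness to the ideal.
    V = w * X / np.sqrt((X ** 2).sum(axis=0))
    best, worst = V.max(axis=0), V.min(axis=0)
    d_best = np.sqrt(((V - best) ** 2).sum(axis=1))
    d_worst = np.sqrt(((V - worst) ** 2).sum(axis=1))
    return d_worst / (d_best + d_worst)   # in [0, 1]; higher = closer to ideal

# Toy example: 4 institutions x 3 connectedness-style indicators.
X = np.array([[0.9, 0.4, 0.7],
              [0.3, 0.2, 0.1],
              [0.6, 0.8, 0.5],
              [0.2, 0.1, 0.3]])
print(entropy_weight_topsis(X))   # higher score = more systemically important
```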
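Returning to the image-encryption cryptanalysis above: the published scheme is not reproduced here, but the toy reduction below shows the class of flaw at stake, namely that a keystream independent of the plaintext image is recovered outright by a single chosen plaintext.

```python
import numpy as np

rng = np.random.default_rng(1)
keystream = rng.integers(0, 256, size=16, dtype=np.uint8)  # fixed, key-derived

def encrypt(p):
    # Flawed toy scheme: the ciphertext does not depend on the image content.
    return p ^ keystream

# Chosen-plaintext attack: encrypting the all-zero image leaks the keystream...
recovered = encrypt(np.zeros(16, dtype=np.uint8))
# ...which then decrypts any other ciphertext produced under the same key.
secret = rng.integers(0, 256, size=16, dtype=np.uint8)
assert np.array_equal(encrypt(secret) ^ recovered, secret)
```

Making each encrypted result plaintext-dependent, as the enhanced algorithm proposed in the work aims to do, blocks exactly this reuse of a recovered keystream.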