10/11/2024


We observe that every mixture η of a state ρ that satisfies the conjecture with an arbitrary incoherent state σ also satisfies the conjecture. We also observe that when the von Neumann entropy is defined with the natural logarithm ln instead of log₂, the reduced relative entropy measure of coherence C̄_r(ρ) = −Tr(ρ_diag ln ρ_diag) + Tr(ρ ln ρ) satisfies the inequality C̄_r(ρ) ≤ C_ℓ1(ρ) for any state ρ.
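As a quick numerical illustration of this inequality (a sketch, not taken from the paper): the snippet below samples Ginibre-random density matrices with NumPy and checks C̄_r(ρ) = S(ρ_diag) − S(ρ) ≤ C_ℓ1(ρ) = Σ_{i≠j} |ρ_ij|, with the von Neumann entropy computed in nats.

```python
# Numerical spot-check of C̄_r(ρ) <= C_ℓ1(ρ) on random mixed states (illustrative sketch).
import numpy as np

def random_density_matrix(d, rng):
    # Ginibre construction: G G† normalized to unit trace is a valid density matrix.
    g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    rho = g @ g.conj().T
    return rho / np.trace(rho).real

def von_neumann_entropy(rho):
    # S(ρ) = -Tr(ρ ln ρ), from the eigenvalues, in nats (natural logarithm).
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-(w * np.log(w)).sum())

def coherence_measures(rho):
    rho_diag = np.diag(np.diag(rho))                  # fully dephased state ρ_diag
    c_r = von_neumann_entropy(rho_diag) - von_neumann_entropy(rho)   # C̄_r(ρ)
    c_l1 = np.abs(rho - rho_diag).sum()               # C_ℓ1(ρ): sum of |off-diagonal| entries
    return c_r, c_l1

rng = np.random.default_rng(0)
for d in (2, 3, 4):
    for _ in range(1000):
        c_r, c_l1 = coherence_measures(random_density_matrix(d, rng))
        assert c_r <= c_l1 + 1e-9, (d, c_r, c_l1)
print("C̄_r(ρ) <= C_ℓ1(ρ) held on all sampled states")
```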
Multidimensional data point clouds representing large datasets are frequently characterized by non-trivial low-dimensional geometry and topology, which can be recovered by unsupervised machine learning approaches, in particular by principal graphs. Principal graphs approximate the multivariate data by a graph injected into the data space, with some constraints imposed on the node mapping. Here we present ElPiGraph, a scalable and robust method for constructing principal graphs. ElPiGraph exploits and further develops the concept of elastic energy, the topological graph grammar approach, and a gradient descent-like optimization of the graph topology. The method is able to withstand high levels of noise and is capable of approximating data point clouds via principal graph ensembles. This strategy can be used to estimate the statistical significance of complex data features and to summarize them into a single consensus principal graph. ElPiGraph deals efficiently with large datasets in various fields such as biology, where it can be used, for example, with single-cell transcriptomic or epigenomic datasets to infer gene expression dynamics and recover differentiation landscapes.

In digital micromirror device (DMD) digital mask projection lithography, lithography efficiency can be greatly enhanced by path planning of the pattern transfer. This paper proposes a new dual-operator and dual-population ant colony (DODPACO) algorithm. Firstly, load operators and feedback operators are used to update the local and global pheromones in the white ant colony, while the feedback operator is used in the yellow ant colony; the concept of information entropy is used to adaptively regulate the sizes of the yellow and white ant colonies. Secondly, eight groups of large-scale instances from TSPLIB are taken as examples to compare DODPACO with two classical ACO and six improved ACO algorithms; the results show that the DODPACO algorithm is superior in solving large-scale instances in terms of solution quality and convergence speed. Thirdly, PCB production is taken as an example to verify the time saved by path planning; using the DODPACO algorithm for path planning saves 34.3% of the time compared with no path planning, and the planned path is about 1% shorter than that of the second-best algorithm. The DODPACO algorithm is applicable to the path planning of pattern transfer in DMD digital mask projection lithography and other digital mask lithography. (A generic ant-colony sketch is given after the fault-diagnosis paragraph below.)

The quantum Szilard engine constitutes an instructive interplay of thermodynamics, information theory, and quantum mechanics. Szilard engines are in general operated by a Maxwell's demon, with Landauer's principle resolving the apparent paradoxes. Here we propose a Szilard engine setup that does not feature an explicit Maxwell's demon. In a demonless Szilard engine, the acquisition of which-side information is not required, but the erasure and the related heat dissipation still take place implicitly. We explore a quantum Szilard engine considering quantum size effects. We see that insertion of the partition does not localize the particle to one side but instead creates a superposition state of the particle being on both sides. To be able to extract work from the system, the particle has to be localized on one side. The localization occurs as a result of a quantum measurement on the particle, which shows the importance of the measurement process regardless of whether one uses the acquired information or not. In accordance with Landauer's principle, localization by quantum measurement corresponds to a logically irreversible operation, and for this reason it must be accompanied by the corresponding heat dissipation. This shows the validity of Landauer's principle even in quantum Szilard engines without Maxwell's demon.

Nonlinear non-equilibrium thermodynamic relations have been constructed based on the generalized Ehrenfest-Klein model. Using these relations, the time behavior of the entropy and of the entropy production at arbitrary deviations from equilibrium has been studied. It has been shown that the transient fluctuation theorem is valid for this model if a dissipation functional is treated as the thermodynamic entropy production.

Multi-principal-element alloys share a set of thermodynamic and structural parameters that, in their range of adopted values, correlate with the tendency of the alloys to assume a solid solution, whether as a crystalline or an amorphous phase. Based on empirical correlations, this work presents a computational method for the prediction of possible glass-forming compositions for a chosen alloy system as well as the calculation of their critical cooling rates. The obtained results compare well to experimental data for Pd-Ni-P, micro-alloyed Pd-Ni-P, Cu-Mg-Ca, and Cu-Zr-Ti. Furthermore, a random-number-generator-based algorithm is employed to explore glass-forming candidate alloys with a minimum critical cooling rate, reducing the number of data points necessary to find suitable glass-forming compositions. A comparison with experimental results for the quaternary Ti-Zr-Cu-Ni system shows a promising overlap of calculation and experiment, implying that this is a reasonable method for finding candidate glass-forming alloys with a critical cooling rate sufficiently low to allow the formation of bulk metallic glasses.

Feature extraction is one of the challenging problems in fault diagnosis, and it has a direct bearing on the accuracy of fault diagnosis. Therefore, in this paper, a new method based on ensemble empirical mode decomposition (EEMD), wavelet semi-soft threshold (WSST) signal reconstruction, and multi-scale entropy (MSE) is proposed. First, the EEMD method is applied to decompose the vibration signal into intrinsic mode functions (IMFs), and the high-frequency IMFs, which contain more noise information, are screened by the Pearson correlation coefficient. The WSST method is then applied to denoise the high-frequency part of the signal and reconstruct the signal. Second, the MSE method is applied to calculate the MSE values of the reconstructed signal and construct a feature vector from this complexity measure. Finally, the feature vector is input to a support vector machine (SVM) to obtain the fault diagnosis results. The experimental results show that the proposed method, with better classification performance, can better resolve the problem of effective signal and noise being mixed in the high-frequency components. Based on the proposed method, the fault types can be accurately identified with an average classification accuracy of 100%.
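The multi-scale entropy step above can be made concrete. The following is a generic coarse-graining plus sample-entropy implementation in NumPy (a sketch, not the authors' code; the embedding dimension m = 2 and tolerance r = 0.15·SD of the original series are common defaults assumed here); the resulting MSE vector is the kind of feature vector that would be passed to the SVM.

```python
# Generic multi-scale entropy (MSE): coarse-grain the signal at several scales and
# compute the sample entropy of each coarse-grained series (illustrative sketch).
import numpy as np

def coarse_grain(x, scale):
    # Non-overlapping window averages of length `scale` (the MSE coarse-graining step).
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def sample_entropy(x, m=2, r=0.2):
    # SampEn(m, r) = -ln(A / B): B counts template matches of length m, A of length m + 1;
    # r is an absolute tolerance under the Chebyshev (max-abs) distance.
    x = np.asarray(x, dtype=float)
    def count_matches(length):
        templates = np.array([x[i:i + length] for i in range(len(x) - length)])
        count = 0
        for i in range(len(templates)):
            dist = np.max(np.abs(templates - templates[i]), axis=1)
            count += int(np.sum(dist <= r)) - 1      # exclude the self-match
        return count
    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, max_scale=10, m=2):
    r = 0.15 * np.std(x)                             # tolerance fixed from the original series
    return np.array([sample_entropy(coarse_grain(x, s), m, r)
                     for s in range(1, max_scale + 1)])

# Toy usage: in the pipeline above, `signal` would be the EEMD/WSST-reconstructed
# vibration signal, and the MSE vector would be fed to an SVM classifier.
rng = np.random.default_rng(1)
signal = np.sin(0.05 * np.arange(4000)) + 0.3 * rng.normal(size=4000)
print(multiscale_entropy(signal, max_scale=5))
```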
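As signposted earlier, the ant-colony machinery that DODPACO builds on can be illustrated with a plain single-colony ACO for a TSP-style tour. This sketch is generic: it does not reproduce the paper's load/feedback operators, dual populations, or information-entropy regulation, and the tunables alpha, beta, rho, and q are standard ACO parameters assumed here for illustration.

```python
# Minimal single-colony ant colony optimization for a TSP-style tour (generic sketch).
# It illustrates only the pheromone/heuristic mechanism; it does NOT implement
# DODPACO's dual-operator, dual-population, or information-entropy scheme.
import numpy as np

def aco_tsp(coords, n_ants=20, n_iters=100, alpha=1.0, beta=3.0, rho=0.1, q=1.0, seed=0):
    rng = np.random.default_rng(seed)
    n = len(coords)
    dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    eta = 1.0 / (dist + np.eye(n))            # heuristic desirability (eye avoids /0)
    tau = np.ones((n, n))                     # pheromone matrix
    best_tour, best_len = None, np.inf
    for _ in range(n_iters):
        tours, lengths = [], []
        for _ in range(n_ants):
            start = int(rng.integers(n))
            tour, unvisited = [start], set(range(n)) - {start}
            while unvisited:
                i = tour[-1]
                cand = np.array(sorted(unvisited))
                w = (tau[i, cand] ** alpha) * (eta[i, cand] ** beta)
                nxt = int(rng.choice(cand, p=w / w.sum()))
                tour.append(nxt)
                unvisited.remove(nxt)
            length = sum(dist[tour[k], tour[(k + 1) % n]] for k in range(n))
            tours.append(tour)
            lengths.append(length)
            if length < best_len:
                best_tour, best_len = tour, length
        tau *= (1.0 - rho)                    # global pheromone evaporation
        for tour, length in zip(tours, lengths):
            for k in range(n):
                u, v = tour[k], tour[(k + 1) % n]
                tau[u, v] += q / length       # pheromone deposit on the used edges
                tau[v, u] += q / length
    return best_tour, best_len

points = np.random.default_rng(2).random((30, 2))
tour, tour_length = aco_tsp(points)
print(f"best tour length found: {tour_length:.3f}")
```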
Traditional hypothesis-margin research focuses on obtaining large margins and on feature selection. In this work, we show that the robustness of margins is also critical and can be measured using entropy. In addition, our approach provides clear mathematical formulations and explanations to uncover feature interactions, which is often lacking in large hypothesis-margin-based approaches. We design an algorithm, termed IMMIGRATE (Iterative max-min entropy margin-maximization with interaction terms), for training the weights associated with the interaction terms. IMMIGRATE simultaneously utilizes both local and global information and can be used as a base learner in Boosting. We evaluate IMMIGRATE on a wide range of tasks, in which it demonstrates exceptional robustness and achieves state-of-the-art results with high interpretability.

Power plants and thermal systems in which products such as electricity and steam are generated affect the natural environment, as well as human society, through the discharge of wastes. The wastes from such plants may include ashes, flue gases, and hot water streams. The waste cost is of primary importance in plant operation and industrial ecology. Therefore, an appropriate approach for including the waste cost in a thermoeconomic analysis is essential. In this study, a method is proposed for taking the waste cost into account in thermoeconomics when determining the production cost of products via thermoeconomic analysis. The calculation of the waste cost flow rates at the dissipative units and their allocation to system components are important for obtaining the production cost of a plant.

Fano's inequality is one of the most elementary, ubiquitous, and important tools in information theory. Using majorization theory, Fano's inequality is generalized to a broad class of information measures, which contains those of Shannon and Rényi. When specialized to these measures, it recovers and generalizes the classical inequalities. Key to the derivation is the construction of an appropriate conditional distribution inducing a desired marginal distribution on a countably infinite alphabet. The construction is based on the infinite-dimensional version of Birkhoff's theorem proven by Révész [Acta Math. Hungar. 1962, 3, 188-198], and the constraint of maintaining a desired marginal distribution is similar to coupling in probability theory. Using our Fano-type inequalities for Shannon's and Rényi's information measures, we also investigate the asymptotic behavior of the sequence of Shannon's and Rényi's equivocations when the error probabilities vanish. This asymptotic behavior provides a novel characterization of the asymptotic equipartition property (AEP) via Fano's inequality.

The famous singlet correlations of a composite quantum system consisting of two two-level components in the singlet state exhibit notable features of two kinds. One kind is a set of striking certainty relations: perfect anti-correlation and perfect correlation under certain joint settings. The other kind is a set of symmetries, namely invariance under a common rotation of the settings, invariance under exchange of components, and invariance under exchange of both measurement outcomes. One might like to restrict attention to rotations in the plane, since those are the ones most commonly investigated experimentally. One can then also further distinguish between the case of discrete rotations (e.g., only settings which are a whole number of degrees are allowed) and continuous rotations. We study the class of classical correlation functions, i.e., those generated by classical physical systems, satisfying all these symmetries in the continuous, planar case. We call such correlation functions classical EPR-B correlations. [...] over a whole range of settings. It is found by a search procedure in which we randomly generate classical physical models and, for each generated model, evaluate its properties in a further Monte-Carlo simulation of the model itself.
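To give a concrete sense of what a classical EPR-B correlation looks like, the sketch below Monte-Carlo-simulates one textbook local hidden-variable model (a shared hidden angle read out through sign functions) and compares its planar correlation function with the singlet value −cos(a − b). This only illustrates evaluating a classical model's correlation by Monte Carlo; it is not one of the randomly generated models or the search procedure of the paper.

```python
# Monte Carlo estimate of the correlation function E(a, b) of one classical
# (local hidden-variable) model, compared with the singlet correlation -cos(a - b).
# Illustrative only: a textbook model, not the paper's generated models or search.
import numpy as np

def classical_correlation(a, b, n_trials=200_000, rng=None):
    if rng is None:
        rng = np.random.default_rng(0)
    lam = rng.uniform(0.0, 2.0 * np.pi, size=n_trials)   # shared hidden angle
    out_a = np.sign(np.cos(a - lam))                      # outcome +/-1 at station A
    out_b = -np.sign(np.cos(b - lam))                     # outcome +/-1 at station B
    return float(np.mean(out_a * out_b))

# The model depends only on a - b (invariance under a common rotation) and gives
# perfect anti-correlation at a = b, but only a piecewise-linear "triangle" curve
# in between, unlike the quantum -cos(a - b).
for theta in np.linspace(0.0, np.pi, 7):
    e_classical = classical_correlation(0.0, theta)
    print(f"a - b = {theta:4.2f} rad   classical E = {e_classical:+.3f}   "
          f"singlet -cos = {-np.cos(theta):+.3f}")
```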
In quantum physics, two prototypical model systems stand out due to their wide range of applications. These are the two-level system (TLS) and the harmonic oscillator. The former is often an ideal model for confined charge or spin systems, and the latter for lattice vibrations, i.e., phonons. Here, we couple these two systems, which leads to numerous fascinating physical phenomena. Practically, we consider different optical excitations and decay scenarios of a TLS, focusing on the resulting dynamics of a single phonon mode that couples to the TLS. Special emphasis is placed on the entropy of the different parts of the system, predominantly the phonons. While, without any decay, the entire system is always in a pure state, resulting in a vanishing total entropy, the complex interplay between the individual parts results in non-vanishing entanglement entropies of the respective subsystems and in non-trivial dynamics of these entropies. Taking a decay of the TLS into account leads to a non-vanishing entropy of the full system and to additional aspects of its dynamics.
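As a minimal numerical sketch of the entanglement-entropy behaviour described above (assumptions: a resonant Jaynes-Cummings-type TLS-phonon coupling, a truncated phonon space, no decay; the parameter values are illustrative, not taken from the paper), the snippet below evolves the pure product state |e⟩⊗|0⟩ and tracks the von Neumann entropy of the reduced phonon state, which oscillates while the total state remains pure.

```python
# Two-level system (TLS) coupled to a single phonon mode (Jaynes-Cummings-type,
# rotating-wave coupling), no decay. We evolve the pure state |e, 0> and track the
# von Neumann entropy of the reduced phonon state; the total state stays pure
# (vanishing total entropy) while the reduced entropy oscillates.
import numpy as np
from scipy.linalg import expm

N = 10                       # phonon Fock-space truncation (assumption)
omega = 1.0                  # phonon frequency (hbar = 1, illustrative)
Omega = 1.0                  # TLS splitting, resonant with the phonon (illustrative)
g = 0.1                      # TLS-phonon coupling strength (illustrative)

a = np.diag(np.sqrt(np.arange(1, N)), k=1)           # phonon annihilation operator
sm = np.array([[0.0, 0.0], [1.0, 0.0]])              # sigma_- = |g><e| in the basis {|e>, |g>}
sz = np.diag([1.0, -1.0])                            # sigma_z in the same basis
I2, IN = np.eye(2), np.eye(N)

H = (omega * np.kron(I2, a.conj().T @ a)
     + 0.5 * Omega * np.kron(sz, IN)
     + g * (np.kron(sm, a.conj().T) + np.kron(sm.conj().T, a)))

def entropy(rho):
    # Von Neumann entropy -Tr(rho ln rho) from the eigenvalues (in nats).
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-(w * np.log(w)).sum())

psi0 = np.kron(np.array([1.0, 0.0]), np.eye(N)[0])   # |e> (x) |0 phonons>
for t in np.linspace(0.0, 40.0, 9):
    psi = expm(-1j * H * t) @ psi0
    rho = np.outer(psi, psi.conj())                              # full (pure) state
    rho_ph = rho.reshape(2, N, 2, N).trace(axis1=0, axis2=2)     # trace out the TLS
    print(f"t = {t:5.1f}   S_phonon = {entropy(rho_ph):.3f}   S_total = {entropy(rho):.3f}")
```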