EntropyHub: An open-source toolkit for entropic time series analysis

Matthew W. Flood
Human Motion, Orthopaedics, Sports Medicine and Digital Methods (HOSD), Luxembourg Institute of Health (LIH), Eich, Luxembourg

Bernd Grimm
Human Motion, Orthopaedics, Sports Medicine and Digital Methods (HOSD), Luxembourg Institute of Health (LIH), Eich, Luxembourg

Mashallah Rezakazemi, Editor
Shahrood University of Technology, ISLAMIC REPUBLIC OF IRAN

Competing Interests: The authors have declared that no competing interests exist.

* Email: matthew.flood@lih.lu

Received 2021 Jul 1;

Copyright © 2021 Flood, Grimm

This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.


Abstract

An increasing number of studies across many research fields from biomedical engineering to finance are employing measures of entropy to quantify the regularity, variability or randomness of time series and image data. Entropy, which relates to information theory and dynamical systems theory, can be estimated in many ways, with newly developed methods continually being introduced in the scientific literature. Despite the growing interest in entropic time series and image analysis, there is a shortage of validated, open-source software tools that enable researchers to apply these methods. To date, packages for performing entropy analysis are often run through graphical user interfaces, lack the necessary supporting documentation, or do not include functions for more advanced entropy methods, such as cross-entropy, multiscale cross-entropy or bidimensional entropy. In light of this, this paper introduces EntropyHub, an open-source toolkit for performing entropic time series analysis in MATLAB, Python and Julia. EntropyHub (version 0.1) provides an extensive range of more than forty functions for estimating cross-, multiscale, multiscale cross-, and bidimensional entropy, each including a number of keyword arguments that allow the user to specify multiple parameters in the entropy calculation. Instructions for installation, descriptions of function syntax, and examples of use are fully detailed in the supporting documentation, available on the EntropyHub website: www.EntropyHub.xyz. Compatible with Windows, Mac and Linux operating systems, EntropyHub is hosted on GitHub, as well as on the native package repositories for MATLAB, Python and Julia, respectively. The goal of EntropyHub is to integrate the many established entropy methods into one complete resource, providing tools that make advanced entropic time series analysis straightforward and reproducible.

Introduction

Through the lens of probability, information and uncertainty can be viewed as two sides of the same relationship: the more uncertainty there is, the more information we gain by removing that uncertainty. This is the principle behind Shannon's formulation of entropy (Eq 1), which quantifies the uncertainty associated with a random process []

H(X) = -\sum_{i=1}^{n} p(x_i) \log_b p(x_i)        (1)

where H(X) is the entropy (H) of the sequence (X) given the probability (p) of its states (x_i). An extension of Shannon entropy, the conditional entropy (Eq 2), quantifies the information gained about a process (X) given prior information provided by a process Y,

H(X|Y) = -\sum_{i=1}^{n} p(x_i|Y=y) \log_b p(x_i|Y=y)        (2)

where y can represent the state of a separate system or a previous state of the same system. Many variants have since been derived from conditional entropy, and to a lesser extent from Shannon entropy, to estimate the information content of time series data across various scientific domains [], resulting in what has recently been termed the "entropy universe" []. This entropy universe continues to expand as new methods are derived with improved statistical properties over their precursors, such as robustness to short signal lengths [–], robustness to noise [–], and insensitivity to amplitude fluctuations [–]. In addition, novel entropy variants are being identified that quantify the variability of time series data in specific applications, including the assessment of cardiac disease from electrocardiograms [–] and the detection of machine failure from vibration signals [, ]
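As a brief, standalone illustration of Eq (1), independent of the EntropyHub toolkit itself, the Shannon entropy of a discrete probability distribution can be computed directly in Python with NumPy; the probabilities below are a hypothetical example.

        import numpy as np

        def shannon_entropy(p, base=2):
            # Shannon entropy (Eq 1) of a discrete probability distribution p.
            p = np.asarray(p, dtype=float)
            p = p[p > 0]  # convention: 0*log(0) is taken as 0
            return -np.sum(p * np.log(p)) / np.log(base)

        # A fair coin has maximal entropy (1 bit); a biased coin has less.
        print(shannon_entropy([0.5, 0.5]))  # 1.0
        print(shannon_entropy([0.9, 0.1]))  # approximately 0.469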

As the popularity of entropy spreads beyond mathematics into subjects ranging from neurophysiology [–] to finance [–], there is an emerging demand for software packages that can be used to perform entropic time series analysis. Open-source software plays an important role in addressing the replication crisis in science by providing validated algorithmic tools that are available to all researchers [, ]. Without access to such software tools, researchers who are less literate in computer programming may be forced to borrow algorithms from unverified sources that are prone to coding errors. Furthermore, software packages often serve as an entry point for researchers unfamiliar with a subject to develop an understanding of the most commonly used methods and how they are applied. This point is particularly relevant in the context of entropy, a concept that is frequently misinterpreted [, , ], and where similarly named method variants can be difficult to follow. For example, derivatives of the original sample entropy algorithm [], itself an improvement on approximate entropy [], include modified sample entropy (fuzzy entropy) [], multiscale (sample) entropy [], composite multiscale entropy [], and refined multiscale entropy.

Several packages offering entropy-related functions have been released in recent years [–], aimed primarily at the analysis of physiological data (Table 1). Although these packages offer some useful tools, they lack the capacity to perform extensive data analysis with multiple methods from the cross-entropy [], bidimensional entropy [], and multiscale entropy [] families of algorithms. Additionally, the utility of these packages is limited for several reasons. The CEPS [], EZ Entropy [] and PyBioS [] packages all operate through graphical user interfaces (GUIs) with facilities to plot and process data interactively. The interactive nature of GUIs can be beneficial when analysing small datasets but becomes burdensome when analysing large datasets where automated processing tasks are advantageous. Both CEPS [] and EZ Entropy [] are designed for the MATLAB programming environment (MathWorks, MA, USA), which requires a purchased license to use. This paywall prevents many users from accessing the software and consequently impedes the replication of results achieved using these packages. Neither PyBioS nor EZ Entropy has accompanying documentation describing how to use the software, and neither toolbox is hosted on the native package repository for MATLAB (MathWorks File Exchange) or Python (PyPI), which would facilitate direct and simplified installation and updating.

Table 1

List of resources that provide entropy analysis tools

Name | Language | Interface | Access Link | Details

EntropyHub | MATLAB, Python, Julia | Command line | MATLAB Add-On Explorer; Python Package Index (PyPI); JuliaHub; GitHub; EntropyHub.jl GitHub repo; www.EntropyHub.xyz | See Table 2 for the full list of functions in version 0.1. EntropyHub provides 18 Base entropy methods for univariate data analysis (e.g. sample entropy, fuzzy entropy, etc.) and 8 Cross-entropy methods (e.g. cross-permutation entropy, cross-distribution entropy). There are also 4 bidimensional entropy methods for 2D/image analysis (e.g. bidimensional dispersion entropy, bidimensional sample entropy). Several multiscale entropy variants are also available, which can employ each of the Base and Cross-entropy methods.

CEPS [] | MATLAB | GUI | BitBucket | Includes Shannon, Rényi, minimum, Tsallis, Kolmogorov-Sinai, conditional, corrected-conditional, approximate, sample, fuzzy, permutation, distribution, dispersion, phase, slope, bubble, spectral, differential, diffusion, and multiscale entropy methods.

PyBioS [] | Python | GUI | Author contact | Includes sample, fuzzy, permutation, distribution, dispersion, phase, and multiscale entropy methods.

EZ Entropy [] | MATLAB | GUI | GitHub | Includes approximate, sample, fuzzy, permutation, distribution, and conditional entropy methods.

PhysioNet [] | MATLAB, C* | Command line | www.PhysioNet.org | Provides standalone functions for sample, multiscale, and transfer entropy*.


Listed alongside each tool are the programming languages it supports, the interface through which it operates, links to access the software, and a brief outline of the entropy analysis tools it provides.

* The C programming implementation of transfer entropy is not currently available from PhysioNet.

Against this background, this paper introduces EntropyHub, an open-source toolkit for entropic time series analysis in the MATLAB, Python [] and Julia [] programming environments. Incorporating entropy estimators from information theory, probability theory and dynamical systems theory, EntropyHub features a wide range of functions to calculate the entropy of, and the cross-entropy between, univariate time series data. In contrast to other entropy-focused toolboxes, EntropyHub runs from the command line without the use of a GUI and provides many new benefits, including:

  • Functions to perform refined, composite, refined-composite and hierarchical multiscale entropy analysis using more than twenty-five different entropy and cross-entropy estimators (approximate entropy, cross-sample entropy, etc.)

  • Functions to calculate the bidimensional entropy of two-dimensional (image) data

  • A range of function arguments to specify additional parameter values in the entropy calculation, including options for time-delayed state-space reconstruction and normalisation of entropy values where possible

  • Availability in multiple programming languages (MATLAB, Python, Julia) to enable open-source access and to provide cross-platform translation of methods through consistent function syntax. To the best of the Authors' knowledge, this is the first entropy-specific toolkit for the Julia language

  • Compatibility with Windows, Mac and Linux operating systems

  • Comprehensive documentation describing installation, function syntax, examples of use, and references to the source literature. Documentation is available online at www.EntropyHub.xyz (or at MattWillFlood.github.io/EntropyHub), and can also be downloaded as a booklet (EntropyHub Guide.pdf). Dedicated documentation for the MATLAB edition can also be found in the 'Supplemental Software' section of the MATLAB help browser after installation. Dedicated documentation for the Julia edition can also be found at MattWillFlood.github.io/EntropyHub.jl/stable

  • Hosting on the native package repositories for MATLAB (MathWorks File Exchange), Python (PyPI) and Julia (Julia General Registry), to facilitate straightforward download, installation and updating. The latest development releases can also be downloaded from the EntropyHub GitHub repository: www.github.com/MattWillFlood/EntropyHub

As new measures enter the ever-expanding entropy universe, EntropyHub aims to incorporate them accordingly. EntropyHub is licensed under the Apache License (version 2.0) and is available for use by all, on condition that this paper is cited in any scientific output realised using the EntropyHub toolkit.

The following sections of this paper outline the contents of the toolkit, the steps required to install it, and how to access the documentation.

Toolkit contents and functionality

The functions in the EntropyHub toolkit fall into five categories. The first three categories (Base, Cross, and Bidimensional) refer to standalone entropy estimators that are distinguished by the type of input data they analyse:

  • Base functions return the entropy of a single univariate time series, e.g. sample entropy (SampEn), bubble entropy (BubbEn), phase entropy (PhasEn), etc.

  • Cross functions return the cross-entropy between two univariate time series, e.g. cross-fuzzy entropy (XFuzzEn), cross-permutation entropy (XPermEn), etc.

  • Bidimensional functions return the entropy of a single univariate two-dimensional data (image) matrix, e.g. bidimensional distribution entropy (DistEn2D), etc.

The two remaining categories (Multiscale and Multiscale Cross) relate to multiscale entropy methods that employ entropy estimators from the Base and Cross categories, respectively:

  • Multiscale functions return the multiscale entropy of a single univariate time series, calculated using any of the Base entropy estimators, e.g. multiscale entropy (MSEn), composite multiscale entropy (cMSEn), etc.

  • Multiscale Cross functions return the multiscale cross-entropy between two univariate time series, calculated using any of the Cross-entropy estimators, e.g. multiscale cross-entropy (XMSEn), refined multiscale cross-entropy (rXMSEn), etc.

A list of all the functions available in version 0.1 of the EntropyHub toolkit is provided in Table 2. As new entropy methods are identified, they will be added to later versions of the toolkit.

Table 2

List of the Base, Cross-, Bidimensional, Multiscale and Multiscale Cross-entropy functions available in version 0.1 of the EntropyHub toolkit

Entropy Method | Function Name | Reference

Base entropy functions:
Approximate entropy | ApEn | []
Attention entropy | AttnEn | []
Bubble entropy | BubbEn | []
(corrected) Conditional entropy | CondEn | []
Cosine similarity entropy | CoSiEn | []
Dispersion entropy | DispEn | [, –]
Distribution entropy | DistEn | []
Entropy of entropy | EnofEn | []
Fuzzy entropy | FuzzEn | [, ]


* The multiscale entropy object returned by the MSobject function is a required argument for the Multiscale and Multiscale Cross functions.

** Example time series and image data can be imported using the ExampleData function. Use of this function requires an internet connection. The imported data are the same as those used in the examples provided in the EntropyHub documentation.

† Unlike the other Base entropies, spectral entropy (SpecEn) is not derived from information theory or dynamical systems theory, but instead quantifies the entropy of the frequency spectrum.

§ Cross-Kolmogorov and cross-spectral entropy, although included in the toolkit, have not yet been verified in the scientific literature.

One of the main advantages of EntropyHub is the ability to specify the numerous parameters used in the entropy calculation by entering optional keyword function arguments. The default value of each keyword argument is based on the value proposed in the original source literature for that method. However, blindly analysing time series data using these default arguments is strongly discouraged. Drawing conclusions about data based on entropy values is only valid when the parameters used to calculate those values accurately capture the underlying dynamics of the data.
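To illustrate this keyword-argument interface, the following sketch calls the sample entropy function from the Python edition with non-default parameters. It assumes the keyword names (m, tau, r) and the returned values match those described in the EntropyHub documentation; check the documentation of the installed version before relying on it.

        import numpy as np
        import EntropyHub as EH

        # Import the Gaussian white noise example series (requires an internet connection).
        X = EH.ExampleData('gaussian')

        # Sample entropy with explicit, non-default parameters:
        # embedding dimension m = 3, time delay tau = 2, threshold r = 0.15*SD(X).
        # Keyword names are assumed to mirror the documented arguments.
        Samp, A, B = EH.SampEn(X, m=3, tau=2, r=0.15 * np.std(X))
        print(Samp)  # entropy estimates for embedding dimensions 0..m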

With certain Base and Cross functions, it is possible to calculate the entropy using variant methods of the main estimator. For example, with the permutation entropy function (PermEn) one can calculate the edge [], weighted [], amplitude-aware [], modified [], fine-grained [], and uniform-quantisation [] permutation entropy variants. It is important to note that although the main variable returned by each function is the estimated entropy value, most functions provide secondary and tertiary variables that may be of interest to the user. Some examples include the dispersion entropy function (DispEn) [], which also returns the reverse dispersion entropy [], the spectral entropy function (SpecEn) [], which also returns the band-spectral entropy [], and the Kolmogorov entropy function (K2En). Additionally, every Multiscale and Multiscale Cross function has the option to plot the multiscale (cross-) entropy curve (Fig 1), as do several of the Base functions (Figs 2 and 3).
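As a minimal sketch of retrieving one of these secondary outputs in the Python edition (here the reverse dispersion entropy from DispEn, as described above; the return order is an assumption to be checked against the documentation):

        import EntropyHub as EH

        X = EH.ExampleData('gaussian')

        # Dispersion entropy with default parameters; the second output is assumed
        # to be the reverse dispersion entropy (RDE) per the description above.
        Dispx, RDE = EH.DispEn(X)
        print(Dispx, RDE)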


Fig 1

Representative plot of the multiscale entropy curve returned by the Multiscale or Multiscale Cross-entropy functions

The curve shown corresponds to the multiscale bubble entropy of a Gaussian white noise signal (N = 5000, μ = 0, σ = 1), calculated over 5 coarse-grained time scales with the estimator parameters: embedding dimension (m) = 2, time delay (τ) = 1


Fig 2

Second-order difference plot returned by the phase entropy function (PhasEn)

Representative second-order difference plot of the x component of the Hénon system of equations (α = 1.4, β = 0.3), calculated with time delay (τ) = 2 and partitions (K) = 9


Fig 3

Poincaré plot and bivariate histogram returned by the gridded distribution entropy function (GridEn)

Representative Poincaré plot and bivariate histogram of the x component of the Lorenz system of equations (σ = 10, β = 8/3, ρ = 28), calculated with grid partitions (m) = 5 and time delay (τ)

Installation and dependencies

Major version releases of the EntropyHub toolkit can be installed directly from the native package repositories for the MATLAB, Python and Julia programming environments. Beta development versions can be downloaded and installed from each programming language's directory hosted on the EntropyHub GitHub repository: github.com/MattWillFlood/EntropyHub. EntropyHub is compatible with Windows, Mac and Linux operating systems.

MATLAB

Two add-on toolboxes from the MATLAB product family are required to experience the full functionality of the EntropyHub toolkit: the Signal Processing Toolbox and the Statistics and Machine Learning Toolbox. However, the majority of functions will work without these toolboxes. EntropyHub is intended for use with MATLAB versions ≥ 2016a. In some cases the toolkit may function on versions 2015a and 2015b, although installation on MATLAB versions older than 2016 is not recommended.

There are two ways to install EntropyHub in MATLAB.

Option 1. Note: Option 1 requires the user to be logged in to their MathWorks account.

  1. In the MATLAB application, open the Add-Ons browser under the 'Home' tab by clicking 'Get Add-Ons'

  2. In the search bar, search for "EntropyHub" (S1b Fig)

  3. Open the resulting link and click 'Add' in the top-right corner (S1c Fig)

  4. Follow the prompts to install the toolbox

Option 2

  1. Go to the 'EntropyHub - MatLab' directory in the EntropyHub repository on GitHub: https://github.com/MattWillFlood/EntropyHub/tree/main/EntropyHub%20-%20MatLab

  2. Download the MATLAB toolbox file (EntropyHub.mltbx)

  3. Open the MATLAB application and change the current folder to the directory where the EntropyHub.mltbx file is saved

  4. Double-click the EntropyHub.mltbx file to open it and click install

To check that EntropyHub has been correctly installed, enter "EntropyHub" at the command line and the EntropyHub logo should be displayed

Python

There are several modules required to use EntropyHub in Python: NumPy [], SciPy [], Matplotlib [], PyEMD [], and Requests. These modules will be automatically installed alongside EntropyHub if not already installed. EntropyHub was designed using Python 3 and thus is not intended for use with Python 2 or Python versions < 3.6. EntropyHub Python functions are primarily built on top of the NumPy module for mathematical computation [], so vector or matrix variables are returned as NumPy array objects.

There are two ways to install EntropyHub in Python. Option 1 is strongly recommended.

Option 1. Note: Option 1 requires the 'pip' Python package installer.

  • Using pip, enter the following at the command line:

            pip install EntropyHub

    *Note: this command is case sensitive

Option 2

  1. Go to the 'EntropyHub - Python' directory in the EntropyHub repository on GitHub: https://github.com/MattWillFlood/EntropyHub/tree/main/EntropyHub%20-%20Python

  2. Download the EntropyHub.x.x.x.tar.gz archive and unzip it

  3. Open a command prompt (cmd on Windows, terminal on Mac), or the Anaconda prompt if Anaconda is the user's Python package distribution

  4. In the command prompt/terminal, navigate to the directory where the EntropyHub.x.x.x.tar.gz archive was saved and extracted

  5. Enter the following at the command line:

            python setup.py install

  6. Ensure that an up-to-date version of the setuptools module is installed:

            python -m pip install --upgrade setuptools

To use EntropyHub, import the module with the following command:

        import EntropyHub as EH

To check that EntropyHub has been correctly installed and loaded, enter:

        EH. greet()

Julia

There are a number of modules required to use EntropyHub in Julia: DSP, FFTW, HTTP, DelimitedFiles, Random, Plots, StatsBase, StatsFuns, Statistics, GroupSlices, Combinatorics, Clustering, LinearAlgebra, and Dierckx []. These modules will be automatically installed alongside EntropyHub if not already installed. EntropyHub was designed using Julia 1.5 and is intended for use with Julia versions ≥ 1.2.

To install EntropyHub in Julia,

  1. In the Julia programming environment, open the package REPL by typing ']'

  2. At the command line, enter:

            add EntropyHub

    *Note: this command is case sensitive

    Alternatively, one can install EntropyHub from the EntropyHub.jl GitHub repository:

            add https://github.com/MattWillFlood/EntropyHub.jl

To use EntropyHub, import the module with the following command:

        using EntropyHub

To check that EntropyHub has been correctly installed and loaded, type:

        EntropyHub. greet()

Supporting documentation and help

To help users get the most out of EntropyHub, extensive documentation has been developed that covers all aspects of the toolkit. Included in the documentation are:

  • Instructions for installation

  • Thorough descriptions of the application programming interface (API) syntax: function names, keyword arguments, output values, etc.

  • References to the original source literature for each method

  • Licensing and terms of use

  • Examples of use

Supporting documentation is available in various formats from the following sources

www.EntropyHub.xyz

The EntropyHub website, www.EntropyHub.xyz (also available at MattWillFlood.github.io/EntropyHub), is the primary source of information on the toolkit, with dedicated sections for MATLAB, Python and Julia, as well as release updates and links to helpful internet resources.

EntropyHub guide

The EntropyHub Guide.pdf is the toolkit user manual and can be downloaded from the EntropyHub website or from the EntropyHub GitHub repository. In addition to the information given on the website, the EntropyHub Guide.pdf document provides some extra material, such as plots of the fuzzy functions used for fuzzy entropy (FuzzEn) calculation, or plots of the symbolic mapping procedures used in dispersion entropy (DispEn) or symbolic dynamic entropy (SyDyEn).

MATLAB help browser

Custom built documentation for the MATLAB edition of the toolkit is accessible through the MATLAB help browser after installation. Every function has its own help page featuring several examples of use ranging from basic to advanced. To access this documentation, open the help browser in the MATLAB application and at the bottom of the contents menu on the main page, under ‘Supplemental Software’, click on the link ‘EntropyHub Toolbox’

EntropyHub.jl

Custom documentation for the Julia edition of the toolkit can also be found at MattWillFlood.github.io/EntropyHub.jl (linked to the EntropyHub website). Following Julia package convention, the Julia edition is given the suffix '.jl' and is hosted in a standalone GitHub repository linked to the main EntropyHub repository.

Seeking further help

Within each programming environment, information about a specific function can be displayed in the command prompt by accessing the function docstrings. For example, to display information about the approximate entropy function (ApEn), type

  • MATLAB:     help ApEn

  • Python:     help(EntropyHub.ApEn)        (if imported as EntropyHub)

  • Julia:      julia>?                 (to open help mode in the REPL)

  •             help?> ApEn

Contact

For help with topics not addressed in the documentation, users can seek help by contacting the toolkit developers at help@entropyhub.xyz. Every effort will be made to promptly respond to all queries received.

To ensure that EntropyHub works as intended, with accurate and robust algorithms at its core, users are encouraged to report any potential bugs or errors discovered. The recommended way to report issues is to open an issue post under the 'Issues' tab on the EntropyHub GitHub repository. Doing so allows other users to find answers to common issues and contribute their own solutions. Alternatively, one can notify the package developers of any issues via email to fix@entropyhub.xyz.

Continuous integration of new and improved entropy methods into the toolkit is a core principle of the EntropyHub project. Thus, requests and suggestions for new features are welcomed, as are contributions and offers of collaboration. EntropyHub developers will work with collaborators to ensure that contributions are valid, translated into MATLAB, Python and Julia, and follow the formatting adopted throughout the toolkit. Please contact info@entropyhub.xyz regarding any proposals one wishes to make.

Validation

Included in EntropyHub are a number of sample time series and image datasets which can be used to test the validity of the toolkit functions (Fig 4). Included in these datasets are random number sequences (Gaussian, uniform, random integers), chaotic attractors (Lorenz, Hénon), and matrix representations of images (Mandelbrot fractal, random numbers, etc.). Importing these datasets into the programming environment is done using the ExampleData function (Table 2), which requires an internet connection. Every example presented in the supporting documentation on the EntropyHub website, in the MATLAB help browser, or in the EntropyHub Guide.pdf employs the same sample datasets provided by the ExampleData function. Therefore, users can replicate these examples verbatim to verify that the toolkit functions properly on their computer system. The following subsections demonstrate the implementation of several Base, Cross-, Bidimensional, Multiscale and Multiscale Cross-entropy methods as a proof-of-principle validation. Note: the examples in the following subsections use MATLAB syntax, but the implementation of these functions and the values they return are the same when using Python and Julia.


Fig 4

Sample datasets available with the EntropyHub toolkit through the ExampleData function

(a) A gaussian white noise time series, (b) the Lorenz system of equations, (c) a Mandelbrot fractal

Base entropy

A sequence of normally distributed random numbers is imported (Fig 4A; N = 5000, mean = 0, SD = 1) and its approximate entropy is estimated using the default parameters (embedding dimension = 2, time delay = 1, threshold = 0.2*SD[X]).

>> X = ExampleData('gaussian');

>> ApEn(X)

2.33505  2.29926  2.10113

Random number sequences produce high entropy values because such sequences have maximal uncertainty and unpredictability. The high approximate entropy values (> 2) returned in this example, corresponding to estimates for embedding dimensions 0, 1 and 2, are within the expected range for such a time series.
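The same estimate can be reproduced from the Python edition; a minimal sketch, assuming ApEn returns the entropy estimates as its first output, mirroring the MATLAB example above:

        import EntropyHub as EH

        # Same example as above using the Python edition (function syntax is
        # consistent across language editions; ExampleData requires an internet connection).
        X = EH.ExampleData('gaussian')
        Ap, Phi = EH.ApEn(X)  # default parameters: m = 2, tau = 1, r = 0.2*SD(X)
        print(Ap)             # estimates for embedding dimensions 0, 1 and 2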

Cross-entropy

The x, y and z components of the Lorenz system of equations are imported (Fig 4B; N = 5917, σ = 10, β = 8/3, ρ = 28), and the cross-permutation entropy between the x and y components is estimated.

>> X = ExampleData('lorenz');

>> XPermEn(X(:,1:2))

0.17771

The Lorenz system is commonly employed in nonlinear dynamics as its attractor exhibits chaotic behaviour. Thus, the low cross-permutation entropy estimate returned here (0.17771) reflects the high degree of deterministic structure shared between the x and y components of the Lorenz system.

Bidimensional entropy

A matrix of normally distributed (Gaussian) random numbers is imported ( Fig 4C ; N = 60x120, mean = 0, SD = 1) and bidimensional dispersion entropy is estimated with a template submatrix size of 5 and all other parameters set to default values (time delay = 1, number of symbols = 3, symbolic mapping transform = normal cumulative distribution function).

>> X = ExampleData('gaussian_Mat');

>> DispEn2D(X, 'm', 5)

8.77894

The high value of the bidimensional dispersion entropy estimate corresponds to those previously reported for Gaussian white noise []

Multiscale entropy

A chirp signal (N = 5000, t0 = 1, t1 = 4000, normalised instantaneous frequency at t0 = 0.01 Hz, instantaneous frequency at t1 = 0.025 Hz) is imported and multiscale sample entropy is estimated over 5 coarse-grained temporal scales using the default parameters (embedding dimension = 2, time delay = 1, threshold = 0.2*SD[X]). Note: a multiscale entropy object (Mobj) must be used with multiscale entropy functions.

>> X = ExampleData('chirp');

>> Mobj = MSobject('SampEn');

>> MSEn(X, Mobj, 'Scales', 5)

0.2738  0.3412  0.4257  0.5452  0.6759

The chirp signal imported in this example represents a swept-frequency cosine with a linearly decreasing period length. The coarse-graining procedure of multiscale entropy [] functions as a low-pass filter of the original time series, with a lower cut-off frequency at each increasing time scale. Therefore, the coarse-graining procedure increasingly diminishes the localised auto-correlation of the chirp signal at each temporal scale, increasing the entropy. This is reflected in the increase in sample entropy values from low (0.2738) to moderate (0.6759) returned by the MSEn function.
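For reference, the coarse-graining procedure referred to above simply averages the signal over non-overlapping windows whose length equals the scale factor; a minimal NumPy sketch, independent of the toolkit's own implementation:

        import numpy as np

        def coarse_grain(x, scale):
            # Coarse-grain a 1D series by averaging non-overlapping windows of length 'scale'.
            x = np.asarray(x, dtype=float)
            n = len(x) // scale  # number of complete windows
            return x[:n * scale].reshape(n, scale).mean(axis=1)

        # Each increasing scale acts like a stronger low-pass filter of the original series.
        x = np.random.randn(5000)
        print(len(coarse_grain(x, 5)))  # 1000 samples at scale 5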

Multiscale cross-entropy

Two sequences of uniformly distributed random numbers (N = 4096, range = [0, ]) are imported and multiscale cross-distribution entropy is estimated over 7 coarse-grained temporal scales with the default parameters (embedding dimension = 2, time delay = 1, histogram binning method = ‘sturges’, normalisation with respect to number of histogram bins = true)

>> X = ExampleData('uniform2');

>> Mobj = MSobject('XDistEn');

>> XMSEn(X, Mobj)

0.95735  0.86769  0.83544  0.80433  0.82617  0.77619  0.78893

As expected, the normalised multiscale cross-distribution entropy values remain relatively constant over multiple time scales as no information can be gained about one sequence from the other at any time scale
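A corresponding call from the Python edition might look as follows; this is a sketch that assumes XMSEn accepts a Scales keyword and returns the per-scale values together with a complexity index, as described in the EntropyHub documentation.

        import EntropyHub as EH

        # Two uniform random sequences imported as a two-column array (requires internet).
        X = EH.ExampleData('uniform2')

        # Multiscale cross-distribution entropy over 7 coarse-grained scales.
        # The 'Scales' keyword and the (MSx, CI) return pair are assumptions
        # to be checked against the documentation of the installed version.
        Mobj = EH.MSobject('XDistEn')
        MSx, CI = EH.XMSEn(X, Mobj, Scales=7)
        print(MSx)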

Discussion

The growing number of entropy methods reported in the scientific literature for time series and image analysis warrants new software tools that enable researchers to apply such methods [, , ]. Currently, there is a dearth of validated, open-source tools that implement a comprehensive array of entropy methods at the command line with options to modify multiple parameter values. EntropyHub is the first toolkit to provide this functionality in a package that is available in three programming languages (MATLAB, Python, and Julia) with consistent syntax, and is supported by extensive documentation (Table 3). To the best of the Authors' knowledge, EntropyHub is also the first toolkit to provide multiple functions for bidimensional entropy [–], multiscale entropy [, , , , ] and multiscale cross-entropy analyses [, , ] all in one package. Specific programming language editions of the EntropyHub toolkit are hosted on the native package repositories for MATLAB, Python and Julia (Table 3), facilitating straightforward installation and version updates. EntropyHub is compatible with Windows, Mac and Linux operating systems, and is open for use under the Apache License (Version 2.0) on condition that the present manuscript be cited in any outputs achieved through the use of the toolkit.

Table 3

List of resources for the EntropyHub toolkit

Online Resources
EntropyHub Website | www.EntropyHub.xyz; MattWillFlood.github.io/EntropyHub
GitHub Repository | www.github.com/MattWillFlood/EntropyHub; www.github.com/MattWillFlood/EntropyHub.jl (Julia-only repository)
MATLAB Package | www.mathworks.com/matlabcentral/fileexchange/94185-entropyhub
Python Package | pypi.org/project/EntropyHub/
Julia Package | juliahub.com/ui/Packages/EntropyHub/npy5E/0.1.0

Contact Details
General Inquiries | info@entropyhub.xyz
Help and Support | help@entropyhub.xyz
Reporting Bugs | fix@entropyhub.xyz


All information about the toolkit, including installation instructions, documentation, and release updates, can be found on the main EntropyHub website. Users can get in touch directly with the package developers by contacting the email addresses provided.

The application of entropy in the study of time series data is becoming more common in all manner of research fields such as engineering [, ], medicine [–] and finance [–]. The broad range of entropy functions provided by EntropyHub in multiple programming languages can serve to support researchers in these fields by characterising the uncertainty and complexity of time series data with various stochastic, time-frequency and chaotic properties. Additionally, this is the first toolkit to provide several functions for performing bidimensional (2D) entropy analysis, which can enable users to estimate the entropy of images and matrix data

The goal of EntropyHub is to continually integrate newly developed entropy methods and serve as a cohesive computing resource for all entropy-based analysis, independent of the application or research field. To achieve this goal, suggestions for new features and contributions from other researchers are welcomed

Supporting information

S1 Fig

Instructions for installing EntropyHub in MATLAB

(TIF)


S2 Fig

Instructions for installing EntropyHub in Python

(TIF)


S3 Fig

Instructions for installing EntropyHub in Julia

(TIF)


Acknowledgments

The Authors wish to thank Dr Lara McManus and Ben O’Callaghan for generously donating their time to test the toolkit and provide constructive feedback which substantially improved the end result. The Authors would also like to acknowledge the work of the scientific community in deriving the entropy methods that motivated the development of EntropyHub

Funding Statement

This research was funded by the Luxembourg Institute of Health (https://www.lih.lu). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Data Availability

No data are associated with this manuscript.

References

1. Shannon CE. A Mathematical Theory of Communication. Bell Syst Tech J . 1948; 27 . 379–423. doi. 10. 1002/j. 1538-7305. 1948. tb01338. x [CrossRef] [Google Scholar]

2. Li W, Zhao Y, Wang Q, Zhou J. Twenty Years of Entropy Research. A Bibliometric Overview . Entropy . 2019; 21 . 694. doi. 10. 3390/e21070694 [PMC free article] [PubMed] [CrossRef] [Google Scholar]

3. Ribeiro M, Henriques T, Castro L, Souto A, Antunes L, Costa-Santos C, et al. The Entropy Universe . Entropy . 2021; 23 . 222. doi. 10. 3390/e23020222 [PMC free article] [PubMed] [CrossRef] [Google Scholar]

4. Yentes JM, Hunt N, Schmid KK, Kaipust JP, McGrath D, Stergiou N. The appropriate use of approximate entropy and sample entropy with short data sets . Ann Biomed Eng . 2013; 41 . 349–365. doi. 10. 1007/s10439-012-0668-3 [PMC free article] [PubMed] [CrossRef] [Google Scholar]

5. Humeau-Heurtier A, Wu CW, Wu S De. Refined Composite Multiscale Permutation Entropy to Overcome Multiscale Permutation Entropy Length Dependence . IEEE Signal Process Lett . 2015; 22 . 2364–2367. doi. 10. 1109/LSP. 2015. 2482603 [CrossRef] [Google Scholar]

6. Li P, Liu C, Li K, Zheng D, Liu C, Hou Y. Assessing the complexity of short-term heartbeat interval series by distribution entropy . Med Biol Eng Comput . 2015; 53 . 77–87. doi. 10. 1007/s11517-014-1216-0 [PubMed] [CrossRef] [Google Scholar]

7. Cuesta-Frau D, Murillo-Escobar JP, Orrego DA, Delgado-Trejos E. Embedded dimension and time series length. Practical influence on permutation entropy and its applications . Entropy . 2019; 21 . 385. doi. 10. 3390/e21040385 [PMC free article] [PubMed] [CrossRef] [Google Scholar]

8. Rostaghi M, Azami H. Dispersion Entropy. A Measure for Time-Series Analysis . IEEE Signal Process Lett . 2016; 23 . 610–614. doi. 10. 1109/LSP. 2016. 2542881 [CrossRef] [Google Scholar]

9. Xiong W, Faes L, Ivanov PC, Ch Ivanov P. Entropy measures, entropy estimators, and their performance in quantifying complex dynamics. Effects of artifacts, nonstationarity, and long-range correlations . Phys Rev E . 2017; 95 . 62114. doi. 10. 1103/PhysRevE. 95. 062114 [PMC free article] [PubMed] [CrossRef] [Google Scholar]

10. Ramdani S, Bouchara F, Lagarde J. Influence of noise on the sample entropy algorithm . Chaos . 2009; 19 . 13123. doi. 10. 1063/1. 3081406 [PubMed] [CrossRef] [Google Scholar]

11. Azami H, Escudero J. Amplitude-aware permutation entropy. Illustration in spike detection and signal segmentation . Comput Methods Programs Biomed . 2016; 128 . 40–51. doi. 10. 1016/j. cmpb. 2016. 02. 008 [PubMed] [CrossRef] [Google Scholar]

12. Cuesta-Frau D. Permutation entropy. Influence of amplitude information on time series classification performance . Math Biosci Eng . 2019; 16 . 6842–6857. doi. 10. 3934/mbe. 2019342 [PubMed] [CrossRef] [Google Scholar]

13. Chen W, Wang Z, Xie H, Yu W. Characterization of surface EMG signal based on fuzzy entropy . IEEE Trans Neural Syst Rehabil Eng . 2007; 15 . 266–272. doi. 10. 1109/TNSRE. 2007. 897025 [PubMed] [CrossRef] [Google Scholar]

14. Valencia JF, Porta A, Vallverdú M, Clarià F, Baranowski R, Orłowska-Baranowska E, et al. Refined multiscale entropy. Application to 24-h holter recordings of heart period variability in healthy and aortic stenosis subjects . IEEE Trans Biomed Eng . 2009; 56 . 2202–2213. doi. 10. 1109/TBME. 2009. 2021986 [PubMed] [CrossRef] [Google Scholar]

15. Costa M, Goldberger AL, Peng CK. Multiscale Entropy Analysis of Complex Physiologic Time Series . Phys Rev Lett . 2002; 89 . 068102. doi. 10. 1103/PhysRevLett. 89. 068102 [PubMed] [CrossRef] [Google Scholar]

16. Hsu CF, Lin P-YY, Chao H-HH, Hsu L, Chi S. Average Entropy. Measurement of disorder for cardiac RR interval signals . Phys A Stat Mech its Appl . 2019; 529 . 121533. doi. 10. 1016/j. physa. 2019. 121533 [CrossRef] [Google Scholar]

17. Li Y, Wang X, Liu Z, Liang X, Si S. The entropy algorithm and its variants in the fault diagnosis of rotating machinery. A review . IEEE Access . 2018; 6 . 66723–66741. doi. 10. 1109/ACCESS. 2018. 2873782 [CrossRef] [Google Scholar]

18. Huo Z, Martinez-Garcia M, Zhang Y, Yan R, Shu L. Entropy Measures in Machine Fault Diagnosis. Insights and Applications . IEEE Trans Instrum Meas . 2020; 69 . 2607–2620. doi. 10. 1109/TIM. 2020. 2981220 [CrossRef] [Google Scholar]

19. Kannathal N, Choo ML, Acharya UR, Sadasivan PK. Entropies for detection of epilepsy in EEG . Comput Methods Programs Biomed . 2005; 80 . 187–194. doi. 10. 1016/j. cmpb. 2005. 06. 012 [PubMed] [CrossRef] [Google Scholar]

20. Flood MW, Jensen BR, Malling AS, Lowery MM. Increased EMG intermuscular coherence and reduced signal complexity in Parkinson’s disease . Clin Neurophysiol. 2019; 130 . 259–269. doi. 10. 1016/j. clinph. 2018. 10. 023 [PubMed] [CrossRef] [Google Scholar]

21. Abásolo D, Hornero R, Espino P, Poza J, Sánchez CI, De La Rosa R. Analysis of regularity in the EEG background activity of Alzheimer’s disease patients with Approximate Entropy. Clin Neurophysiol. 2005; 116 . 1826–1834. doi. 10. 1016/j. clinph. 2005. 04. 001 [PubMed] [CrossRef] [Google Scholar]

22. Thuraisingham RA, Gottwald GA. On multiscale entropy analysis for physiological data . Phys A Stat Mech its Appl . 2006; 366 . 323–332. doi. 10. 1016/j. physa. 2005. 10. 008 [CrossRef] [Google Scholar]

23. McManus L, Flood MW, Lowery MM. Beta-band motor unit coherence and nonlinear surface EMG features of the first dorsal interosseous muscle vary with force . J Neurophysiol . 2019; 122 . 1147–1162. doi. 10. 1152/jn. 00228. 2019 [PMC free article] [PubMed] [CrossRef] [Google Scholar]

24. Zhou R, Cai R, Tong G. Applications of entropy in finance. A review . Entropy. MDPI AG ; 2013. pp. 4909–4931. doi. 10. 3390/e15114909 [CrossRef] [Google Scholar]

25. Xu M, Shang P, Zhang S. Multiscale analysis of financial time series by Rényi distribution entropy . Phys A Stat Mech its Appl . 2019; 536 . 120916. doi. 10. 1016/j. physa. 2019. 04. 152 [CrossRef] [Google Scholar]

26. Yin Y, Shang P. Modified cross sample entropy and surrogate data analysis method for financial time series . Phys A Stat Mech its Appl . 2015; 433 . 17–25. doi. 10. 1016/j. physa. 2015. 03. 055 [CrossRef] [Google Scholar]

27. Pincus S. Approximate entropy as an irregularity measure for financial data . Econom Rev . 2008; 27 . 329–362. doi. 10. 1080/07474930801959750 [CrossRef] [Google Scholar]

28. Joppa LN, McInerny G, Harper R, Salido L, Takeda K, O’Hara K, et al. Troubling trends in scientific software use . Science. American Association for the Advancement of Science ; 2013. pp. 814–815. doi. 10. 1126/science. 1231535 [PubMed] [CrossRef] [Google Scholar]

29. Piccolo SR, Frampton MB. Tools and techniques for computational reproducibility . GigaScience. BioMed Central Ltd. ; 2016. doi. 10. 1186/s13742-016-0135-4 [PMC free article] [PubMed] [CrossRef] [Google Scholar]

30. Kostic MM. The Elusive Nature of Entropy and Its Physical Meaning. Entropy . 2014; 16 . 953–967. doi. 10. 3390/e16020953 [CrossRef] [Google Scholar]

31. Popovic M. Researchers in an Entropy Wonderland. A Review of the Entropy Concept . Therm Sci . 2017; 22 . 1163–1178. Available. http. //arxiv. org/abs/1711. 07326 [Google Scholar]

32. Richman JS, Moorman JR. Physiological time-series analysis using approximate entropy and sample entropy . Am J Physiol Circ Physiol . 2000; 278 . H2039–H2049. doi. 10. 1152/ajpheart. 2000. 278. 6. H2039 [PubMed] [CrossRef] [Google Scholar]

33. Pincus SM. Approximate entropy as a measure of system complexity . Proc Natl Acad Sci . 1991; 88 . 2297–2301. doi. 10. 1073/pnas. 88. 6. 2297 [PMC free article] [PubMed] [CrossRef] [Google Scholar]

34. Xie HB, He WX, Liu H. Measuring time series regularity using nonlinear similarity-based sample entropy . Phys Lett Sect A Gen At Solid State Phys . 2008; 372 . 7140–7146. doi. 10. 1016/j. physleta. 2008. 10. 049 [CrossRef] [Google Scholar]

35. Wu S-D, Wu C-W, Lin S-G, Wang C-C, Lee K-Y. Time Series Analysis Using Composite Multiscale Entropy . Entropy . 2013; 15 . 1069–1084. doi. 10. 3390/e15031069 [CrossRef] [Google Scholar]

36. Wu S De Wu CW, Lin SG Lee KY, Peng CK Analysis of complex time series using refined composite multiscale entropy . Phys Lett Sect A Gen At Solid State Phys . 2014; 378 . 1369–1374. doi. 10. 1016/j. physleta. 2014. 03. 034 [CrossRef] [Google Scholar]

37. Li P. EZ Entropy. A software application for the entropy analysis of physiological time-series . Biomed Eng Online . 2019; 18 . 30. doi. 10. 1186/s12938-019-0650-5 [PMC free article] [PubMed] [CrossRef] [Google Scholar]

38. Mayor D, Panday D, Kandel HK, Steffert T, Banks D. Ceps. An open access matlab graphical user interface (gui) for the analysis of complexity and entropy in physiological signals . Entropy . 2021; 23 . 1–34. doi. 10. 3390/e23030321 [PMC free article] [PubMed] [CrossRef] [Google Scholar]

39. Eduardo Virgilio Silva L, Fazan R Jr, Antonio Marin-Neto J. PyBioS. A freeware computer software for analysis of cardiovascular signals . Comput Methods Programs Biomed . 2020; 197 . 105718. doi. 10. 1016/j. cmpb. 2020. 105718 [PubMed] [CrossRef] [Google Scholar]

40. Jamin A, Humeau-Heurtier A. (Multiscale) Cross-Entropy Methods. A Review . Entropy . 2019; 22 . 45. doi. 10. 3390/e22010045 [PMC free article] [PubMed] [CrossRef] [Google Scholar]

41. Humeau-Heurtier A. Texture feature extraction methods. A survey. IEEE Access. Institute of Electrical and Electronics Engineers Inc. ; 2019. pp. 8975–9000. doi. 10. 1109/ACCESS. 2018. 2890743 [CrossRef] [Google Scholar]

42. Kuntzelman K, Jack Rhodes L, Harrington LN, Miskovic V. A practical comparison of algorithms for the measurement of multiscale entropy in neural time series data . Brain Cogn . 2018; 123 . 126–135. doi. 10. 1016/j. bandc. 2018. 03. 010 [PubMed] [CrossRef] [Google Scholar]

43. Moody GB, Mark RG, Goldberger AL. Physionet. A web-based resource for the study of physiologic signals . IEEE Eng Med Biol Mag . 2001; 20 . 70–75. doi. 10. 1109/51. 932728 [PubMed] [CrossRef] [Google Scholar]

44. Van Rossum G, Drake FL. The python reference manual. 20AD. Available. www. python. org [Google Scholar]

45. Bezanson J, Edelman A, Karpinski S, Shah VB. Julia. A fresh approach to numerical computing . SIAM Rev . 2017; 59 . 65–98. doi. 10. 1137/141000671 [CrossRef] [Google Scholar]

46. Yang J, Choudhary GI, Rahardja S, Franti P. Classification of Interbeat Interval Time-series Using Attention Entropy . IEEE Trans Affect Comput . 2020. doi. 10. 1109/TAFFC. 2017. 2784832 [CrossRef] [Google Scholar]

47. Manis G, Aktaruzzaman M, Sassi R. Bubble entropy. An entropy almost free of parameters . IEEE Trans Biomed Eng . 2017; 64 . 2711–2718. doi. 10. 1109/TBME. 2017. 2664105 [PubMed] [CrossRef] [Google Scholar]

48. Porta A, Baselli G, Lombardi F, Montano N, Malliani A, Cerutti S. Conditional entropy approach for the evaluation of the coupling strength . Biol Cybern . 1999; 81 . 119–129. doi. 10. 1007/s004220050549 [PubMed] [CrossRef] [Google Scholar]

49. Chanwimalueang T, Mandic DP. Cosine Similarity Entropy. Self-Correlation-Based Complexity Analysis of Dynamical Systems . Entropy . 2017; 19 . 652. doi. 10. 3390/e19120652 [CrossRef] [Google Scholar]

50. Li Gao, Wang. Reverse Dispersion Entropy. A New Complexity Measure for Sensor Signal . Sensors. 2019; 19 . 5203. doi. 10. 3390/s19235203 [PMC free article] [PubMed] [CrossRef] [Google Scholar]

51. Azami H, Escudero J. Amplitude- and Fluctuation-Based Dispersion Entropy . Entropy . 2018; 20 . 210. doi. 10. 3390/e20030210 [PMC free article] [PubMed] [CrossRef] [Google Scholar]

52. Fu W, Tan J, Xu Y, Wang K, Chen T. Fault Diagnosis for Rolling Bearings Based on Fine-Sorted Dispersion Entropy and SVM Optimized with Mutation SCA-PSO . Entropy . 2019; 21 . 404. doi. 10. 3390/e21040404 [PMC free article] [PubMed] [CrossRef] [Google Scholar]

53. Hsu C, Wei S-Y, Huang H-P, Hsu L, Chi S, Peng C-K. Entropy of Entropy. Measurement of Dynamical Complexity for Biological Systems. Entropy . 2017; 19 . 550. doi. 10. 3390/e19100550 [CrossRef] [Google Scholar]

54. Yan C, Li P, Liu C, Wang X, Yin C, Yao L. Novel gridded descriptors of poincaré plot for analyzing heartbeat interval time-series . Comput Biol Med . 2019; 109 . 280–289. doi. 10. 1016/j. compbiomed. 2019. 04. 015 [PubMed] [CrossRef] [Google Scholar]

55. Yan C, Li P, Ji L, Yao L, Karmakar C, Liu C. Area asymmetry of heart rate variability signal . Biomed Eng Online . 2017; 16 . 112. doi. 10. 1186/s12938-017-0402-3 [PMC free article] [PubMed] [CrossRef] [Google Scholar]

56. Porta A, Casali KR, Casali AG, Gnecchi-Ruscone T, Tobaldini E, Montano N, et al. Temporal asymmetries of short-term heart period variability are linked to autonomic regulation . Am J Physiol Regul Integr Comp Physiol . 2008;295. doi. 10. 1152/ajpregu. 00129. 2008 [PubMed] [CrossRef] [Google Scholar]

57. Karmakar CK, Khandoker AH, Palaniswami M. Phase asymmetry of heart rate variability signal . Physiol Meas . 2015; 36 . 303–314. doi. 10. 1088/0967-3334/36/2/303 [PubMed] [CrossRef] [Google Scholar]

58. Guzik P, Piskorski J, Krauze T, Wykretowicz A, Wysocki H. Heart rate asymmetry by Poincaré plots of RR intervals . Biomedizinische Technik . 2006. pp. 272–275. doi. 10. 1515/BMT. 2006. 054 [PubMed] [CrossRef] [Google Scholar]

59. Liu X, Jiang A, Xu N, Xue J. Increment Entropy as a Measure of Complexity for Time Series . Entropy . 2016; 18 . 22. doi. 10. 3390/e18010022 [CrossRef] [Google Scholar]

60. Liu X, Jiang A, Xu N, Xue J. Correction on Liu X. ; Jiang A. ; Xu N. ; Xue J. Increment Entropy as a Measure of Complexity for Time Series . Entropy 2016, 18 , 22. Entropy. 2016;18. 133. doi. 10. 3390/e18040133 [CrossRef] [Google Scholar]

61. Liu X, Wang X, Zhou X, Jiang A. Appropriate use of the increment entropy for electrophysiological time series. Computers in Biology and Medicine . Elsevier Ltd; 2018. pp. 13–23. doi. 10. 1016/j. compbiomed. 2018. 01. 009 [PubMed] [CrossRef] [Google Scholar]

62. Dünki RM. The estimation of the Kolmogorov entropy from a time series and its limitations when performed on EEG . Bull Math Biol . 1991. doi. 10. 1007/BF02461547 [PubMed] [CrossRef] [Google Scholar]

63. Grassberger P, Procaccia I. Estimation of the Kolmogorov entropy from a chaotic signal . Phys Rev A . 1983; 28 . 2591–2593. doi. 10. 1103/PhysRevA. 28. 2591 [CrossRef] [Google Scholar]

64. Gao L, Wang J, Chen L. Event-related desynchronization and synchronization quantification in motor-related EEG by Kolmogorov entropy . J Neural Eng . 2013; 10 . 036023. doi. 10. 1088/1741-2560/10/3/036023 [PubMed] [CrossRef] [Google Scholar]

65. Huo Z, Zhang Y, Shu L, Liao X. Edge Permutation Entropy. An Improved Entropy Measure for Time-Series Analysis. IECON Proceedings (Industrial Electronics Conference). IEEE Computer Society ; 2019. pp. 5998–6003. doi. 10. 1109/IECON. 2019. 8927449 [CrossRef] [Google Scholar]

66. Bandt C, Pompe B. Permutation Entropy. A Natural Complexity Measure for Time Series . Phys Rev Lett . 2002; 88 . 4. doi. 10. 1103/PhysRevLett. 88. 174102 [PubMed] [CrossRef] [Google Scholar]

67. Xiao-Feng L, Yue W. Fine-grained permutation entropy as a measure of natural complexity for time series . Chinese Phys B . 2009; 18 . 2690–2695. doi. 10. 1088/1674-1056/18/7/011 [CrossRef] [Google Scholar]

68. Bian C, Qin C, Ma QDY, Shen Q. Modified permutation-entropy analysis of heartbeat dynamics . Phys Rev E—Stat Nonlinear, Soft Matter Phys . 2012; 85 . doi. 10. 1103/PhysRevE. 85. 021906 [PubMed] [CrossRef] [Google Scholar]

69. Riedl M, Müller A, Wessel N. Practical considerations of permutation entropy. A tutorial review . European Physical Journal. Special Topics . 2013. pp. 249–262. doi. 10. 1140/epjst/e2013-01862-7 [CrossRef] [Google Scholar]

70. Fadlallah B, Chen B, Keil A, Principe J. Weighted-permutation entropy. A complexity measure for time series incorporating amplitude information . Phys Rev E Stat Nonlinear Soft Matter Phys . 2013; 87 . 022911. doi. 10. 1103/PhysRevE. 87. 022911 [PubMed] [CrossRef] [Google Scholar]

71. Chen Z, Li Y, Liang H, Yu J. Improved permutation entropy for measuring complexity of time series under noisy condition. Complexity . 2019; 2019 . doi. 10. 1155/2019/4203158 [CrossRef] [Google Scholar]

72. Rohila A, Sharma A. Phase entropy. A new complexity measure for heart rate variability . Physiol Meas . 2019; 40 . doi. 10. 1088/1361-6579/ab499e [PubMed] [CrossRef] [Google Scholar]

73. Cuesta-Frau D. Slope Entropy. A New Time Series Complexity Estimator Based on Both Symbolic Patterns and Amplitude Information . Entropy . 2019; 21 . 1167. doi. 10. 3390/e21121167 [CrossRef] [Google Scholar]

74. Powell GE, Percival IC. A spectral entropy method for distinguishing regular and irregular motion of Hamiltonian systems . J Phys A Math Gen . 1979; 12 . 2053. doi. 10. 1088/0305-4470/12/11/017 [CrossRef] [Google Scholar]

75. Inouye T, Shinosaki K, Sakamoto H, Toi S, Ukai S, Iyama A, et al. Quantification of EEG irregularity by use of the entropy of the power spectrum . Electroencephalogr Clin Neurophysiol . 1991; 79 . 204–210. doi. 10. 1016/0013-4694(91)90138-t [PubMed] [CrossRef] [Google Scholar]

76. Wang J, Li T, Xie R, Wang XM, Cao YY. Fault feature extraction for multiple electrical faults of aviation electro-mechanical actuator based on symbolic dynamics entropy. In 2015 IEEE International Conference on Signal Processing, Communications and Computing (ICSPCC) 2015. Sep19 (pp. 1–6). IEEE. [Google Scholar]

77. Li Y, Yang Y, Li G, Xu M, Huang W. A fault diagnosis scheme for planetary gearboxes using modified multi-scale symbolic dynamic entropy and mRMR feature selection . Mech Syst Signal Process . 2017; 91 . 295–312. doi. 10. 1016/j. ymssp. 2016. 12. 040 [CrossRef] [Google Scholar]

78. Rajagopalan V, Ray A. Symbolic time series analysis via wavelet-based partitioning . Signal Processing. 2006; 86 . 3309–3320. doi. 10. 1016/j. sigpro. 2006. 01. 014 [CrossRef] [Google Scholar]

79. Wang Y, Shang P. Analysis of financial stock markets through the multiscale cross-distribution entropy based on the Tsallis entropy . Nonlinear Dyn . 2018; 94 . 1361–1376. doi. 10. 1007/s11071-018-4429-1 [CrossRef] [Google Scholar]

80. Xie HB, Zheng YP, Guo JY, Chen X. Cross-fuzzy entropy. A new method to test pattern synchrony of bivariate time series . Inf Sci (Ny). 2010; 180 . 1715–1724. doi. 10. 1016/j. ins. 2010. 01. 004 [CrossRef] [Google Scholar]

81. Shi W, Shang P, Lin A. The coupling analysis of stock market indices based on cross-permutation entropy . Nonlinear Dyn . 2015; 79 . 2439–2447. doi. 10. 1007/s11071-014-1823-1 [CrossRef] [Google Scholar]

82. Azami H, Escudero J, Humeau-Heurtier A. Bidimensional Distribution Entropy to Analyze the Irregularity of Small-Sized Textures . IEEE Signal Process Lett . 2017; 24 . 1338–1342. doi. 10. 1109/LSP. 2017. 2723505 [CrossRef] [Google Scholar]

83. Azami H, Virgilio Da Silva E, Omoto ACM, Humeau-Heurtier A. Two-dimensional dispersion entropy. An information-theoretic method for irregularity analysis of images . Signal Process Image Commun . 2019; 75 . 178–187. doi. 10. 1016/j. image. 2019. 04. 013 [CrossRef] [Google Scholar]

84. Hilal M, Gaudencio ASF, Berthin C, Vaz PG, Cardoso J, Martin L, et al. Bidimensional Colored Fuzzy Entropy Measure. A Cutaneous Microcirculation Study. International Conference on Advances in Biomedical Engineering, ICABME. Institute of Electrical and Electronics Engineers Inc. ; 2019. doi. 10. 1109/ICABME47164. 2019. 8940215 [CrossRef] [Google Scholar]

85. Segato dos Santos LF, Neves LA, Rozendo GB, Ribeiro MG, Zanchetta do Nascimento M, Azevedo Tosta TA. Multidimensional and fuzzy sample entropy (SampEnMF) for quantifying H&E histological images of colorectal cancer . Comput Biol Med . 2018; 103 . 148–160. doi. 10. 1016/j. compbiomed. 2018. 10. 013 [PubMed] [CrossRef] [Google Scholar]

86. Silva LE V, Filho ACSS, Fazan VPS, Felipe JC, Junior LOM. Two-dimensional sample entropy. Assessing image texture through irregularity . Biomed Phys Eng Express . 2016; 2 . 045002. doi. 10. 1088/2057-1976/2/4/045002 [CrossRef] [Google Scholar]

87. Nikulin VV, Brismar T. Comment on "Multiscale Entropy Analysis of Complex Physiologic Time Series" . Phys Rev Lett . 2004; 92 . doi. 10. 1103/PhysRevLett. 92. 089803 [PubMed] [CrossRef] [Google Scholar]

88. Costa M, Goldberger AL, Peng CK. Costa, Goldberger, and Peng Reply . Phys Rev Lett . 2004; 92 . doi. 10. 1103/PhysRevLett. 92. 089804 [CrossRef] [Google Scholar]

89. Hu M, Liang H. Intrinsic mode entropy based on multivariate empirical mode decomposition and its application to neural data analysis . Cogn Neurodyn . 2011; 5 . 277–284. doi. 10. 1007/s11571-011-9159-8 [PMC free article] [PubMed] [CrossRef] [Google Scholar]

90. Humeau-Heurtier A. The multiscale entropy algorithm and its variants. A review . Entropy . 2015; 17 . 3110–3123. doi. 10. 3390/e17053110 [CrossRef] [Google Scholar]

91. Gao J, Hu J, Tung WW. Entropy measures for biological signal analyses . Nonlinear Dyn . 2012; 68 . 431–444. doi. 10. 1007/s11071-011-0281-2 [CrossRef] [Google Scholar]

92. Castiglioni P, Coruzzi P, Bini M, Parati G, Faini A. Multiscale Sample Entropy of Cardiovascular Signals. Does the Choice between Fixed or Varying Tolerance among Scales Influence Its Evaluation and Interpretation? Entropy . 2017; 19 . 590. doi. 10. 3390/e19110590 [CrossRef] [Google Scholar]

93. Pham TD. Time-Shift Multiscale Entropy Analysis of Physiological Signals . Entropy . 2017; 19 . 257. doi. 10. 3390/e19060257 [CrossRef] [Google Scholar]

94. Azami H, Escudero J. Coarse-graining approaches in univariate multiscale sample and dispersion entropy . Entropy . 2018; 20 . 138. doi. 10. 3390/e20020138 [PMC free article] [PubMed] [CrossRef] [Google Scholar]

95. Marwaha P, Sunkaria RK. Optimal selection of threshold value 'r' for refined multiscale entropy . Cardiovasc Eng Technol . 2015; 6 . 557–576. doi. 10. 1007/s13239-015-0242-x [PubMed] [CrossRef] [Google Scholar]

96. Jiang Y, Peng CK, Xu Y. Hierarchical entropy analysis for biological signals . Journal of Computational and Applied Mathematics . 2011. pp. 728–742. doi. 10. 1016/j. cam. 2011. 06. 007 [CrossRef] [Google Scholar]

97. Yan R, Yang Z, Zhang T. Multiscale cross entropy. A novel algorithm for analyzing two time series . 5th International Conference on Natural Computation, ICNC 2009. 2009. pp. 411–413. doi. 10. 1109/ICNC. 2009. 118 [CrossRef] [Google Scholar]

98. Wu H-T, Lee C-Y, Liu C-C, Liu A-B. Multiscale Cross-Approximate Entropy Analysis as a Measurement of Complexity between ECG R-R Interval and PPG Pulse Amplitude Series among Normal and Diabetic Subjects . Comput Math Methods Med . 2013; 2013 . doi. 10. 1155/2013/231762 [PMC free article] [PubMed] [CrossRef] [Google Scholar]

99. Jamin A, Duval G, Annweiler C, Abraham P, Humeau-Heurtier A. A Novel Multiscale Cross-Entropy Method Applied to Navigation Data Acquired with a Bike Simulator. IEEE EMBC 2019. pp. 733–736. doi. 10. 1109/EMBC. 2019. 8856815 [PubMed] [CrossRef] [Google Scholar]

100. Costa M, Goldberger AL, Peng CK. Multiscale entropy analysis of biological signals . Phys Rev E Stat Nonlinear Soft Matter Phys. 2005; 71 . 021906. doi. 10. 1103/PhysRevE. 71. 021906 [PubMed] [CrossRef] [Google Scholar]

101. Yin Y, Shang P, Feng G. Modified multiscale cross-sample entropy for complex time series . Appl Math Comput . 2016; 289 . 98–110. doi. 10. 1016/j. amc. 2016. 05. 013 [CrossRef] [Google Scholar]

102. Zhang XS, Roy RJ, Jensen EW. EEG complexity as a measure of depth of anesthesia for patients . IEEE Trans Biomed Eng . 2001; 48 . 1424–1433. doi. 10. 1109/10. 966601 [PubMed] [CrossRef] [Google Scholar]

103. Harris CR, Millman KJ, van der Walt SJ, Gommers R, Virtanen P, Cournapeau D, et al. Array programming with NumPy . Nature. Nature Research ; pp. 357–362. doi. 10. 1038/s41586-020-2649-2 [PMC free article] [PubMed] [CrossRef] [Google Scholar]

104. Virtanen P, Gommers R, Oliphant TE, Haberland M, Reddy T, Cournapeau D, et al. SciPy 1. 0. Fundamental algorithms for scientific computing in Python . Nat Methods . 2020; 17 . 261–272. doi. 10. 1038/s41592-019-0686-2 [PMC free article] [PubMed] [CrossRef] [Google Scholar]

105. Hunter JD. Matplotlib. A 2D graphics environment . Comput Sci Eng . 2007; 9 . 90–95. doi. 10. 1109/MCSE. 2007. 55 [CrossRef] [Google Scholar]

106. Laszuk D. PyEMD. Python implementation of the Empirical Mode Decomposition (EMD) method. [cited 10 Jun 2021]. Available. https. //github. com/laszukdawid/PyEMD.
