indicate whether a given area has a high probability of forest fire or not, the selected samples
should include all the different situations found in the study area. The situations accounted
for were:
A- Agricultural areas in 2004 that were still agricultural areas in 2005.
B- Areas burned in 2004 and became agricultural areas in 2005.
C- Forest areas in 2004 that were burned in 2005.
D- Areas burned in 2004 and burned again in 2005.
E- Forest areas in 2004 that remained untouched in 2005.
Nine different ANN architectures were tested, with the basic structure consisting of the
input layer, one hidden layer and one output layer. The number of neurons tested in the
hidden layer was 4, 6, 8, 10, 12, 14, 16, 18 and 20. The output layer was fixed at one neuron
with a log-sigmoid transfer function, which means that it generates output values
between 0 and 1 while the inputs can range from negative to positive infinity. Therefore, it was
pre-defined that output values close to 1 would indicate areas with a high probability of forest
fire, while values close to 0 would indicate areas with low or no fire probability.
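The architecture above can be sketched as a single forward pass: the input vector is projected through one hidden layer and then through a single log-sigmoid output neuron. This is an illustrative sketch only, not the authors' code; the hidden-layer transfer function is not stated in the excerpt, so tanh is assumed here, and the weights are random placeholders rather than trained values.

```python
import numpy as np

def log_sigmoid(x):
    # Log-sigmoid transfer function: squashes any real input into (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def forward(x, W_h, b_h, W_o, b_o):
    """One forward pass: input layer -> one hidden layer -> one output neuron."""
    h = np.tanh(W_h @ x + b_h)            # hidden activations (tanh assumed)
    return log_sigmoid(W_o @ h + b_o)     # single output in (0, 1)

rng = np.random.default_rng(0)
n_inputs, n_hidden = 5, 10                # 10 is one of the tested hidden-layer sizes
W_h = rng.normal(size=(n_hidden, n_inputs))
b_h = np.zeros(n_hidden)
W_o = rng.normal(size=(1, n_hidden))
b_o = np.zeros(1)

p = forward(rng.normal(size=n_inputs), W_h, b_h, W_o, b_o)
# p is a length-1 array; values close to 1 would be read as high fire probability.
```

Whatever the input magnitude, the log-sigmoid guarantees the output stays strictly between 0 and 1, which is what allows it to be interpreted as a fire-probability score.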
The ANNs were trained with the Levenberg-Marquardt algorithm (Levenberg, 1944;
Marquardt, 1963), whose application to neural network training is described in Hagan and
Menhaj (1994). One of the problems commonly faced in ANN training is overfitting
(Geman et al., 1992). Overfitting happens when the training error is driven to a very small
value, but when new records are presented to the ANN the error becomes much higher;
this means that the network memorized the training samples but was not able to recognize
unseen samples. To avoid overfitting, the networks were trained using the early stopping
technique (Prechelt, 1998). In this technique the available samples were divided into
training (50%), validation (25%) and test (25%) sets. While the network is trained, the error
on the validation set is monitored. If the network starts to overfit the training samples, the
validation error starts to increase while the training error continues to decrease. So when the
validation error increases successively, the training process is interrupted.
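The split-and-monitor procedure can be sketched as follows. This is a minimal illustration of the general early-stopping idea under assumed details (random shuffling for the split, a patience of 3 successive non-improving epochs), not the specific criterion used in the study; the function names are hypothetical.

```python
import numpy as np

def split_samples(X, y, seed=0):
    """Randomly divide the samples: 50% training, 25% validation, 25% test."""
    idx = np.random.default_rng(seed).permutation(len(X))
    n_tr, n_va = len(X) // 2, len(X) // 4
    tr, va, te = idx[:n_tr], idx[n_tr:n_tr + n_va], idx[n_tr + n_va:]
    return (X[tr], y[tr]), (X[va], y[va]), (X[te], y[te])

def early_stopping_epoch(val_errors, patience=3):
    """Given the validation error recorded after each training epoch, return
    the epoch at which training would be interrupted: the validation error
    has failed to improve `patience` epochs in a row."""
    best, strikes = float("inf"), 0
    for epoch, err in enumerate(val_errors, start=1):
        if err < best:
            best, strikes = err, 0   # still improving: reset the counter
        else:
            strikes += 1             # validation error did not improve
            if strikes >= patience:
                break                # successive increases: stop training
    return epoch
```

For example, `early_stopping_epoch([0.9, 0.5, 0.3, 0.31, 0.33, 0.40], patience=3)` stops at epoch 6: the validation error rose three times in a row after epoch 3, even though the training error might still be falling.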