Shtuchni nejronni merezhi (ShNM, angl. artificial neural network), yaki zazvichaj prosto nazivayut nejronnimi merezhami (NM, angl. neural networks, NN) abo nejromerezhami (angl. neural nets) 1, ce obchislyuvalni sistemi, nathneni biologichnimi nejronnimi merezhami, yaki skladayut mozok tvarin 2.

Shtuchna nejronna merezha ce vzayemopov yazana grupa vuzliv, nathnena sproshennyam nejroniv mozku. Tut kozhna krugla vershina podaye shtuchnij nejron, a strilka podaye z yednannya vihodu odnogo shtuchnogo nejrona z vhodom inshogo.

ShNM gruntuyetsya na sukupnosti z yednanih vuzliv (angl. units, nodes), yaki nazivayut shtuchnimi nejronami i yaki priblizno modelyuyut nejroni biologichnogo mozku. Kozhne z yednannya, yak i sinapsi v biologichnomu mozku, mozhe peredavati signal do inshih nejroniv. Shtuchnij nejron otrimuye signali, potim obroblyaye yih i mozhe signalizuvati nejronam, z yakimi jogo z yednano. Signal u z yednanni ce dijsne chislo, a vihid kozhnogo nejrona obchislyuyetsya deyakoyu nelinijnoyu funkciyeyu sumi jogo vhodiv. Z yednannya nazivayut rebrami (angl. edges). Nejroni ta rebra zazvichaj mayut vagu en (angl. weight), yaka pidlashtovuyetsya v procesi navchannya. Vaga zbilshuye abo zmenshuye silu signalu na z yednanni. Nejroni mozhut mati takij porig, sho signal nadsilayetsya lishe todi, koli sukupnij signal perevishuye cej porig. Yak pravilo, nejroni zibrano v shari (angl. layers). Rizni shari mozhut vikonuvati rizni peretvorennya danih svogo vhodu. Signali prohodyat vid pershogo sharu (sharu vhodu) do ostannogo sharu (sharu vihodu), mozhlivo, pislya prohodzhennya sharami dekilka raziv.

Trenuvannya

Nejronni merezhi navchayutsya (abo yih trenuyut) shlyahom obrobki prikladiv, kozhen z yakih mistit vidomij vhid ta rezultat, utvoryuyuchi jmovirnisno zvazheni asociaciyi mizh nimi, yaki zberigayutsya v strukturi danih samoyi merezhi. Trenuvannya nejronnoyi merezhi zadanim prikladom zazvichaj zdijsnyuyut shlyahom viznachennya riznici mizh obroblenim vihodom merezhi (chasto peredbachennyam) i cilovim vihodom. Cya riznicya ye pohibkoyu. Potim merezha pidlashtovuye svoyi zvazheni asociaciyi vidpovidno do pravila navchannya i z vikoristannyam cogo znachennya pohibki. Poslidovni pidlashtovuvannya prizvedut do viroblyannya nejronnoyu merezheyu rezultativ, use bilshe shozhih na cilovi. Pislya dostatnoyi kilkosti cih pidlashtovuvan trenuvannya mozhlivo pripiniti na osnovi pevnogo kriteriyu. Ce forma kerovanogo navchannya. Taki sistemi navchayutsya vikonuvati zavdannya, rozglyadayuchi prikladi, yak pravilo, bez programuvannya pravil dlya konkretnih zavdan. Napriklad, u rozpiznavanni zobrazhen voni mozhut navchitisya vstanovlyuvati zobrazhennya, na yakih zobrazheni koti, analizuyuchi prikladi zobrazhen, micheni en vruchnu yak "kit" ta "ne kit", i vikoristovuyuchi rezultati dlya identifikuvannya kotiv na inshih zobrazhennyah. Voni roblyat ce bez bud yakogo apriornogo
znannya pro kotiv napriklad sho voni mayut hutro hvosti vusa ta kotopodibni piski Natomist voni avtomatichno porodzhuyut identifikacijni harakteristiki z prikladiv yaki obroblyuyut Istoriya red Dokladnishe Istoriya shtuchnih nejronnih merezh en Najprostishij tip nejronnoyi merezhi pryamogo poshirennya NMPP angl feedforward neural network FNN ce linijna merezha yaka skladayetsya z yedinogo sharu vuzliv vihodu vhodi podayutsya bezposeredno na vihodi cherez nizku vag V kozhnomu vuzli obchislyuyetsya suma dobutkiv vag ta danih vhodiv Serednokvadratichni pohibki mizh cimi obchislenimi vihodami ta zadanimi cilovimi znachennyami minimizuyut shlyahom pidlashtovuvannya vag Cej metod vidomij ponad dva stolittya yak metod najmenshih kvadrativ abo linijna regresiya Lezhandr 1805 ta Gauss 1795 vikoristovuvali jogo yak zasib dlya znahodzhennya dobrogo grubogo linijnogo dopasuvannya do naboru tochok dlya peredbachuvannya ruhu planet 3 4 5 6 7 Vilgelm Lenc en ta Ernst Izing en stvorili ta proanalizuvali model Izinga 1925 8 yaka po suti ye shtuchnoyu rekurentnoyu nejronnoyu merezheyu RNM angl recurrent neural network RNN bez navchannya sho skladayetsya z nejronopodibnih porogovih elementiv 6 1972 roku Sun yiti Amari en zrobiv cyu arhitekturu adaptivnoyu 9 6 Jogo navchannya RNM populyarizuvav Dzhon Gopfild 1982 roku 10 Vorren Makkaloh ta Volter Pitts en 11 1943 takozh rozglyadali nenavchanu obchislyuvalnu model dlya nejronnih merezh 12 Naprikinci 1940 h rokiv D O Gebb 13 stvoriv gipotezu navchannya zasnovanu na mehanizmi nejroplastichnosti sho stala vidomoyu yak gebbove navchannya angl Hebbian learning Farli ta Vesli A Klark en 14 1954 vpershe vikoristali obchislyuvalni mashini zvani todi kalkulyatorami dlya modelyuvannya gebbovoyi merezhi 1958 roku psiholog Frenk Rozenblat vinajshov perceptron angl perceptron pershu vtilenu shtuchnu nejronnu merezhu 15 16 17 18 finansovanu Upravlinnyam vijskovo morskih doslidzhen en SShA 19 Dehto kazhe sho doslidzhennya zaznali zastoyu pislya togo yak Minski ta Pejpert 1969 20 viyavili sho bazovi perceptroni ne zdatni obroblyati shemu viklyuchnogo abo i sho komp yuteram brakuye dostatnoyi potuzhnosti dlya obrobki pridatnih nejronnih merezh Prote na moment vihodu ciyeyi knigi vzhe buli vidomi metodi trenuvannya bagatosharovih perceptroniv BShP angl multilayer perceptron MLP Pershij BShP glibokogo navchannya opublikuvali Oleksij Grigorovich Ivahnenko ta Valentin Lapa 1965 roku pid nazvoyu metod grupovogo urahuvannya argumentiv angl Group Method of Data Handling 21 22 23 Pershij BShP glibokogo navchannya navchenij stohastichnim gradiyentnim spuskom 24 opublikuvav 1967 roku Sun yiti Amari en 25 U komp yuternih eksperimentah provedenih uchnem Amari Sajto p yatisharovij BShP iz dvoma zminyuvanimi sharami navchivsya korisnih vnutrishnih podan dlya klasifikuvannya nelinijno rozdilnih klasiv obraziv 6 Samoorganizacijni karti angl self organizing maps SOM opisav Teuvo Kohonen 1982 roku 26 27 Samoorganizacijni karti ce nejrofiziologichno nathneni 28 nejronni merezhi yaki navchayutsya nizkovimirnogo podannya visokovimirnih danih zberigayuchi pri comu topologichnu strukturu cih danih Voni trenuyutsya za dopomogoyu konkurentnogo navchannya 26 Arhitekturu zgortkovoyi nejronnoyi merezhi ZNM angl convolutional neural network CNN zi zgortkovimi sharami ta sharami ponizhennya diskretizaciyi zaproponuvav Kunihiko Fukusima en 1980 roku 29 Vin nazvav yiyi neokognitronom angl neocognitron 1969 roku vin takozh zaproponuvav peredavalnu funkciyu ReLU angl rectified linear unit vipryamlenij linijnij 
vuzol 30 Cej vipryamlyach stav najpopulyarnishoyu peredavalnoyu funkciyeyu dlya ZNM ta glibokih nejronnih merezh zagalom 31 ZNM stali vazhlivim instrumentom komp yuternogo bachennya Algoritm zvorotnogo poshirennya angl backpropagation ce efektivne zastosuvannya lancyugovogo pravila Lejbnica 1673 32 do merezh diferencijovnih vuzliv Vin takozh vidomij yak zvorotnij rezhim avtomatichnogo diferenciyuvannya abo zvorotne nakopichennya en zavdyaki Seppo Linnainmaa en 1970 33 34 35 36 6 Termin pohibki zvorotnogo poshirennya angl back propagating errors zaprovadiv 1962 roku Frenk Rozenblat 37 6 ale vin ne mav vtilennya ciyeyi proceduri hocha Genri Kelli en 38 ta Brajson en 39 mali bezperervni poperedniki zvorotnogo poshirennya na osnovi dinamichnogo programuvannya 21 40 41 42 vzhe v 1960 61 rokah u konteksti teoriyi keruvannya 6 1973 roku Drejfus vikoristav zvorotne poshirennya dlya pristosovuvannya parametriv kontroleriv proporcijno gradiyentam pohibok 43 1982 roku Pol Verbos en zastosuvav zvorotne poshirennya do BShP u sposib yakij stav standartnim 44 40 1986 roku Rumelhart en Ginton ta Vilyams en pokazali sho zvorotne poshirennya navchayetsya cikavih vnutrishnih podan sliv yak vektoriv oznak koli trenuyetsya peredbachuvati nastupne slovo v poslidovnosti 45 Nejronna merezha z chasovoyu zatrimkoyu angl time delay neural network TDNN Aleksa Vajbelya en 1987 poyednala zgortki spilni vagi ta zvorotne poshirennya 46 47 1988 roku Vej Chzhan zi spivavt zastosovuvali zvorotne poshirennya do ZNM sproshenogo neokognitrona zi zgortkovimi vzayemozv yazkami mizh sharami oznak zobrazhennya ta ostannim povnozv yaznim sharom dlya abetkovogo rozpiznavannya 48 49 1989 roku Yan Lekun zi spivavt navchili ZNM rozpiznavati rukopisni poshtovi indeksi na poshti 50 1992 roku Dzhuan Veng zi spivavt zaproponuvali maksimizuvalne agreguvannya angl max pooling dlya ZNM shobi dopomogti z invariantnistyu shodo najmenshogo zsuvu ta tolerantnistyu do deformuvannya dlya spriyannya rozpiznavannyu trivimirnih ob yektiv en 51 52 53 LeNet 5 1998 7 rivnevu ZNM vid Yana Lekuna zi spivavt 54 yaka klasifikuye cifri bulo zastosovano kilkoma bankami dlya rozpiznavannya rukopisnih chisel na chekah ocifrovanih u zobrazhennya 32 32 pikseliv Pochinayuchi z 1988 roku 55 56 vikoristannya nejronnih merezh peretvorilo galuz peredbachuvannya struktur bilkiv zokrema koli pershi kaskadni merezhi trenuvalisya na profilyah matricyah stvorenih chislennimi virivnyuvannyami poslidovnostej 57 U 1980 h rokah zvorotne poshirennya ne pracyuvalo dobre dlya glibokih NMPP ta RNM Shobi podolati cyu problemu Yurgen Shmidhuber 1992 zaproponuvav iyerarhiyu RNM poperedno trenovanih po odnomu rivnyu samokerovanim navchannyam 58 59 Vona vikoristovuye peredbachuvalne koduvannya en dlya navchannya vnutrishnih podan u kilkoh samoorganizovanih masshtabah chasu Ce mozhe istotno polegshuvati podalshe gliboke navchannya Cyu iyerarhiyu RNM mozhlivo zgornuti angl collapse v yedinu RNM shlyahom distilyuvannya en fragmentuvalnoyi angl chunker merezhi vishogo rivnya v avtomatizuvalnu angl automatizer merezhu nizhchogo rivnya 58 6 1993 roku fragmentuvalnik rozv yazav zavdannya glibokogo navchannya glibina yakogo perevishuvala 1000 60 1992 roku Yurgen Shmidhuber takozh opublikuvav alternativu RNM angl alternative to RNNs 61 yaku zaraz nazivayut linijnim transformerom angl linear Transformer abo transformerom z linearizovanoyu samouvagoyu 62 63 6 za vinyatkom operatora normuvannya Vin navchayetsya vnutrishnih centriv uvagi angl internal spotlights of attention 64 povilna nejronna 
merezha pryamogo poshirennya vchitsya za dopomogoyu gradiyentnogo spusku keruvati shvidkimi vagami inshoyi nejronnoyi merezhi cherez tenzorni dobutki samoporodzhuvanih shabloniv zbudzhennya FROM i TO zvanih teper klyuchem angl key ta znachennyam angl value samouvagi 62 Ce vidobrazhennya uvagi angl attention mapping shvidkih vag zastosovuyut do shablonu zapitu Suchasnij transformer angl Transformer zaproponuvali Ashish Vasvani zi spivavt u svoyij praci 2017 roku Uvaga ce vse sho vam treba 65 Vin poyednuye ce z operatorom softmax ta proyekcijnoyu matriceyu 6 Transformeri vse chastishe obirayut za model dlya obrobki prirodnoyi movi 66 Bagato suchasnih velikih movnih modelej takih yak ChatGPT GPT 4 ta BERT vikoristovuyut same jogo Transformeri takozh vse chastishe vikoristovuyut u komp yuternim bachenni 67 1991 roku Yurgen Shmidhuber takozh opublikuvav zmagalni nejronni merezhi angl adversarial neural networks yaki zmagayutsya mizh soboyu u formi antagonistichnoyi gri de vigrash odniyeyi merezhi ye prograshem inshoyi 68 69 70 Persha merezha ye porodzhuvalnoyu modellyu yaka modelyuye rozpodil imovirnosti nad obrazami na vihodi Druga merezha navchayetsya gradiyentnim spuskom peredbachuvati reakciyu seredovisha na ci obrazi Ce bulo nazvano shtuchnoyu cikavistyu angl artificial curiosity 2014 roku Yan Gudfelou zi spivavt vikoristali cej princip u porodzhuvalnij zmagalnij merezhi angl generative adversarial network GAN 71 Tut reakciya navkolishnogo seredovisha dorivnyuye 1 abo 0 zalezhno vid togo chi nalezhit vihid pershoyi merezhi do zadanogo naboru Ce mozhlivo vikoristovuvati dlya stvorennya realistichnih dipfejkiv 72 Vidminnoyi yakosti zobrazhennya dosyagla StyleGAN en Nvidia 2018 73 na osnovi progresivnoyi porodzhuvalnoyi zmagalnoyi merezhi angl Progressive GAN Tero Karrasa Timo Ajli Samuli Lajne ta Yaakko Lehtinena 74 Tut porodzhuvach viroshuyetsya vid malogo do velikogo piramidnim chinom Diplomnu pracyu Zeppa Hohrajtera en 1991 75 jogo kerivnik Yurgen Shmidhuber nazvav odnim iz najvazhlivishih dokumentiv v istoriyi mashinnogo navchannya 6 Hohrajter viznachiv i proanalizuvav problemu znikannya gradiyentu 75 76 j zaproponuvav dlya yiyi rozv yazannya rekurentni zalishkovi z yednannya Ce prizvelo do poyavi metodu glibokogo navchannya zvanogo dovgoyu korotkochasnoyu pam yattyu DKChP angl long short term memory LSTM opublikovanogo v Neural Computation 1997 77 Rekurentni nejronni merezhi DKChP mozhut navchatisya zadach duzhe glibokogo navchannya angl very deep learning 78 z dovgimi shlyahami rozpodilu vnesku yaki vimagayut spogadiv pro podiyi sho vidbulisya za tisyachi diskretnih chasovih krokiv do cogo Standartnu DKChP angl vanilla LSTM iz zabuvalnim ventilem zaproponuvali 1999 roku Feliks Gers en Shmidhuber ta Fred Kammins 79 DKChP stala najcitovanishoyu nejronnoyu merezheyu XX stolittya 6 2015 roku Rupesh Kumar Shrivastava Klaus Greff i Shmidhuber vikoristali princip DKChP dlya stvorennya magistralevoyi merezhi angl Highway network nejronnoyi merezhi pryamogo poshirennya z sotnyami shariv nabagato glibshoyi za poperedni 80 81 7 misyaciv potomu Kajmin He Syan yu Chzhan Shaocin Ren ta Czyan Sun vigrali zmagannya ImageNet en 2015 roku z vidkritoventilnim abo bezventilnim variantom magistralevoyi merezhi nazvanim zalishkovoyu nejronnoyu merezheyu angl Residual neural network 82 Vona stala najcitovanishoyu nejronnoyu merezheyu XXI stolittya 6 Rozvitok metal oksid napivprovidnikovih MON shem nadvisokogo rivnya integraciyi NVIS u formi tehnologiyi komplementarnih MON KMON dozvoliv zbilshiti kilkist en 
MON tranzistoriv u cifrovij elektronici Ce zabezpechilo bilshu potuzhnist obrobki dlya rozrobki praktichnih shtuchnih nejronnih merezh u 1980 h rokah 83 Do rannih uspihiv nejronnih merezh nalezhali prognozuvannya fondovogo rinku a 1995 roku perevazhno bezpilotnij avtomobil a 84 Dzhefri Ginton zi spivavt 2006 zaproponuvali navchannya visokorivnevih podan z vikoristannyam poslidovnih shariv dvijkovih abo dijsnoznachnih latentnih zminnih z obmezhenoyu mashinoyu Bolcmana 85 dlya modelyuvannya kozhnogo sharu 2012 roku In ta Din stvorili merezhu yaka navchilasya rozpiznavati ponyattya vishogo rivnya taki yak koti lishe pereglyadayuchi nemicheni zobrazhennya 86 Poperednye nekerovane trenuvannya ta zbilshennya obchislyuvalnoyi potuzhnosti GP ta rozpodilenih obchislen dozvolili vikoristovuvati bilshi merezhi zokrema v zadachah rozpiznavannya zobrazhen i bachennya yaki stali vidomi yak gliboke navchannya 87 Chireshan iz kolegami 2010 88 pokazali sho nezvazhayuchi na problemu znikannya gradiyenta GP roblyat zvorotne poshirennya pridatnim dlya bagatosharovih nejronnih merezh pryamogo poshirennya 89 U period mizh 2009 ta 2012 rokami ShNM pochali vigravati nagorodi v konkursah iz rozpiznavannya zobrazhen nablizhayuchis do lyudskogo rivnya vikonannya riznih zavdan spochatku v rozpiznavanni obraziv ta rozpiznavanni rukopisnogo tekstu 90 91 Napriklad dvospryamovana ta bagatovimirna dovga korotkochasna pam yat DKChP 92 93 Grejvsa en zi spivavt vigrala tri zmagannya z rozpiznavannya zv yazanogo rukopisnogo tekstu 2009 roku bez bud yakih poperednih znan pro tri movi yakih potribno bulo navchitisya 92 93 Chireshan iz kolegami stvorili pershi rozpiznavachi obraziv yaki dosyagli lyudskoyi nadlyudskoyi produktivnosti 94 na takih perevirkah yak rozpiznavannya dorozhnih znakiv IJCNN 2012 Modeli red Cej rozdil mozhe buti plutanim abo neyasnim en dlya chitachiv Bud laska dopomozhit proyasniti cej rozdil en Mozhlivo storinka obgovorennya mistit zauvazhennya shodo potribnih zmin sichen 2018 Dokladnishe Matematika shtuchnih nejronnih merezh nbsp Nejron i miyelinovanij akson iz potokom signalu vid vhodiv na dendritah do vihodiv na terminalah aksonaShNM pochalisya yak sproba vikoristati arhitekturu lyudskogo mozku dlya vikonannya zavdan u yakih zvichajni algoritmi mali nevelikij uspih Nezabarom voni pereoriyentuvalisya na pokrashennya empirichnih rezultativ vidmovivshis vid sprob zalishatisya virnimi svoyim biologichnim poperednikam ShNM mayut zdatnist navchatisya nelinijnostej ta skladnih zv yazkiv ta modelyuvati yih Ce dosyagayetsya tim sho nejroni z yednuyutsya za riznimi shemami sho dozvolyaye vihodam odnih nejroniv stati vhodom inshih Cya merezha utvoryuye oriyentovanij zvazhenij graf 95 Shtuchna nejronna merezha skladayetsya z imitacij nejroniv Kozhen nejron z yednano z inshimi vuzlami angl nodes lankami angl links yak biologichne z yednannya akson sinaps dendrit Usi vuzli z yednani lankami otrimuyut deyaki dani j vikoristovuyut yih dlya vikonannya pevnih operacij i zavdan z danimi Kozhna lanka maye vagu angl weight sho viznachaye silu vplivu odnogo vuzla na inshij 96 dozvolyayuchi vagam obirati signal mizh nejronami Shtuchni nejroni red ShNM skladayutsya zi shtuchnih nejroniv yaki konceptualno pohodyat vid biologichnih Kozhen shtuchnij nejron maye vhodi ta vidaye yedinij vihid yakij mozhlivo nadsilati bagatom inshim nejronam 97 Vhodi angl inputs mozhut buti znachennyami oznak zrazka zovnishnih danih takih yak zobrazhennya chi dokumenti abo voni mozhut buti vihodami inshih nejroniv Vihodi kincevih nejroniv vihodu 
(angl. output neurons) nejronnoyi merezhi zavershuyut zavdannya, napriklad rozpiznavannya ob yekta na zobrazhenni. Shobi znajti vihid nejrona, beremo zvazhenu sumu vsih jogo vhodiv z vagami z yednan (angl. connection weights) vid vhodiv do nejrona ta dodayemo do ciyeyi sumi zmishennya (angl. bias) 98. Cyu zvazhenu sumu inodi nazivayut zbudzhennyam (angl. activation). Yiyi potim propuskayut kriz (zazvichaj nelinijnu) peredavalnu funkciyu (angl. activation function) dlya otrimannya vihodu. Pervinnimi vhodami ye zovnishni dani, napriklad zobrazhennya ta dokumenti. Kincevi vihodi zavershuyut zavdannya, napriklad rozpiznavannya ob yekta na zobrazhenni 99.
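Nizhche navedeno minimalnij ilyustrativnij eskiz (Python), yakogo nemaye v pershodzhereli: vin lishe pokazuye opisane vishe obchislennya vihodu odnogo shtuchnogo nejrona, tobto zvazhenu sumu vhodiv plyus zmishennya, propushenu kriz nelinijnu peredavalnu funkciyu. Konkretni chisla, nazvi funkcij ta vibir sigmoyidi yak peredavalnoyi funkciyi ye umovnimi pripushennyami dlya ilyustraciyi.

```python
import math

def sigmoid(z):
    # Priklad nelinijnoyi peredavalnoyi funkciyi; natomist mozhna vzyati ReLU chi tanh.
    return 1.0 / (1.0 + math.exp(-z))

def neuron_output(inputs, weights, bias):
    # Zvazhena suma vhodiv plyus zmishennya ("zbudzhennya" nejrona).
    activation = sum(x * w for x, w in zip(inputs, weights)) + bias
    # Peredavalna funkciya peretvoryuye zbudzhennya na vihid nejrona.
    return sigmoid(activation)

# Umovnij nejron iz troma vhodami.
print(neuron_output([0.5, -1.0, 2.0], weights=[0.4, 0.3, -0.2], bias=0.1))
```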
Budova

Nejroni zazvichaj vporyadkovano v kilka shariv (angl. layers), osoblivo v glibokomu navchanni. Nejroni odnogo sharu z yednuyutsya lishe z nejronami bezposeredno poperednogo j nastupnogo shariv. Shar, yakij otrimuye zovnishni dani, ce shar vhodu (angl. input layer). Shar, yakij vidaye kincevij rezultat, ce shar vihodu (angl. output layer). Mizh nimi ye nul abo bilshe prihovanih shariv (angl. hidden layers). Vikoristovuyut takozh odnosharovi (angl. single layer) ta bezsharovi (angl. unlayered) merezhi. Mizh dvoma sharami mozhlivi kilka shem z yednannya. Voni mozhut buti povnoz yednanimi (angl. fully connected), koli kozhen nejron odnogo sharu z yednuyetsya z kozhnim nejronom nastupnogo sharu. Voni mozhut buti agreguvalnimi (angl. pooling), koli grupa nejroniv odnogo sharu z yednuyetsya z odnim nejronom nastupnogo sharu, znizhuyuchi takim chinom kilkist nejroniv u comu shari 100. Nejroni lishe z takimi zv yazkami utvoryuyut oriyentovanij aciklichnij graf i vidomi yak merezhi pryamogo poshirennya (angl. feedforward networks) 101. Krim togo, merezhi, yaki dozvolyayut z yednannya do nejroniv u tomu zhe abo poperednih sharah, vidomi yak rekurentni merezhi (angl. recurrent networks) 102.

Giperparametr

Dokladnishe: Giperparametr (mashinne navchannya)

Giperparametr (angl. hyperparameter) ce stalij parametr, chiye znachennya vstanovlyuyut pered pochatkom procesu navchannya. Znachennya zhe parametriv (angl. parameters) vivodyat shlyahom navchannya. Do prikladiv giperparametriv nalezhat temp navchannya (angl. learning rate), kilkist prihovanih shariv i rozmir paketa 103. Znachennya deyakih giperparametriv mozhut zalezhati vid znachen inshih giperparametriv. Napriklad, rozmir deyakih shariv mozhe zalezhati vid zagalnoyi kilkosti shariv.

Navchannya

Div. takozh: Matematichna optimizaciya, Teoriya ocinyuvannya ta Mashinne navchannya

Navchannya (angl. learning) ce pristosovuvannya merezhi dlya krashogo vikonannya zavdannya shlyahom rozglyadu vibirkovih sposterezhen. Navchannya vklyuchaye pidlashtovuvannya vag (i, mozhlivo, porogiv) merezhi dlya pidvishennya tochnosti rezultativ. Ce zdijsnyuyetsya shlyahom minimizuvannya sposterezhuvanih pohibok. Navchannya zaversheno, yaksho rozglyad dodatkovih sposterezhen ne znizhuye rivnya pohibki. Navit pislya navchannya riven pohibki zazvichaj ne dosyagaye 0. Yaksho navit pislya navchannya riven pohibki zanadto visokij, zazvichaj potribno zminiti budovu merezhi. Praktichno ce zdijsnyuyut shlyahom viznachennya funkciyi vitrat (angl. cost function), yaku periodichno ocinyuyut protyagom navchannya. Poki yiyi rezultat znizhuyetsya, navchannya trivaye. Vitrati chasto viznachayut yak statistiku, znachennya yakoyi mozhlivo lishe nablizhuvati. Vihodi naspravdi ye chislami, tozh koli pohibka nizka, riznicya mizh rezultatom ("majzhe napevno kit") i pravilnoyu vidpoviddyu ("kit") nevelika. Navchannya namagayetsya zniziti zagalnu vidminnist nad sposterezhennyami. Bilshist modelej navchannya mozhlivo rozglyadati yak pryame zastosuvannya teoriyi optimizaciyi ta statistichnogo ocinyuvannya 95 104.

Temp navchannya

Dokladnishe: Temp navchannya

Temp navchannya (angl. learning rate) viznachaye rozmir koriguvalnih krokiv, yaki zdijsnyuye model dlya pidlashtovuvannya pid pohibku v kozhnomu sposterezhenni 105. Visokij temp navchannya skorochuye trivalist trenuvannya, ale z menshoyu kincevoyu tochnistyu, todi yak nizhchij temp navchannya zajmaye bilshe chasu, ale z potencialom do bilshoyi tochnosti. Taki optimizaciyi, yak Quickprop en (ukr. shvidposhir), perevazhno spryamovani na priskorennya minimizuvannya pohibki, todi yak inshi vdoskonalennya perevazhno namagayutsya pidvishiti nadijnist. Shobi zapobigti ciklichnim kolivannyam useredini merezhi, takim yak cherguvannya vag z yednan, i pokrashiti shvidkist zbigannya, udoskonalennya vikoristovuyut adaptivnij temp navchannya, yakij pidvishuyetsya abo znizhuyetsya nalezhnim chinom 106. Koncepciya impulsu (angl. momentum) dozvolyaye zvazhuvati balans mizh gradiyentom i poperednoyu zminoyu tak, shobi pidlashtovuvannya vagi pevnoyu miroyu zalezhalo vid poperednoyi zmini. Impuls, blizkij do 0, dodaye vagi gradiyentovi, todi yak znachennya, blizke do 1, dodaye vagi krajnij zmini.

Funkciya vitrat

Hocha j mozhlivo viznachati funkciyu vitrat ad hoc, vibir chasto viznachayetsya bazhanimi vlastivostyami ciyeyi funkciyi (takimi yak opuklist) abo tim, sho vona postaye z modeli (napriklad, u jmovirnisnij modeli aposteriornu jmovirnist modeli mozhlivo vikoristovuvati yak oberneni vitrati).

Zvorotne poshirennya

Dokladnishe: Zvorotne poshirennya

Zvorotne poshirennya (angl. backpropagation) ce metod, yakij vikoristovuyut dlya pidlashtovuvannya vag z yednan dlya kompensuvannya kozhnoyi pomilki, viyavlenoyi pid chas navchannya. Velichina pomilki faktichno rozpodilyayetsya mizh z yednannyami. Tehnichno zvorotne poshirennya obchislyuye gradiyent (pohidnu) funkciyi vitrat, pov yazanij iz zadanim stanom, vidnosno vag. Utochnyuvannya vag mozhlivo zdijsnyuvati za dopomogoyu stohastichnogo gradiyentnogo spusku (angl. stochastic gradient descent) abo inshih metodiv, takih yak mashini ekstremalnogo navchannya 107, bezposhirni (angl. no prop) merezhi 108, trenuvannya bez vertannya 109, bezvagovi (angl. weightless) merezhi 110 111 ta ne konektivistski nejronni merezhi en.
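Dlya ilyustraciyi togo, yak poyednuyutsya funkciya vitrat, zvorotne poshirennya, temp navchannya ta impuls, nizhche navedeno sproshenij eskiz na Python (iz bibliotekoyu NumPy), yakogo nemaye v pershodzhereli: dvosharova merezha pryamogo poshirennya trenuyetsya gradiyentnim spuskom na umovnih sintetichnih danih. Rozmiri merezhi, dani ta giperparametri (lr, momentum) ye dovilnimi pripushennyami.

```python
import numpy as np

rng = np.random.default_rng(0)

# Umovni sintetichni dani dlya regresiyi: vhodi X ta cilovi vihodi y.
X = rng.normal(size=(100, 3))
y = (X @ np.array([1.0, -2.0, 0.5]) + 0.3).reshape(-1, 1)

# Dvosharova merezha pryamogo poshirennya: 3 vhodi -> 4 prihovani nejroni -> 1 vihid.
W1, b1 = rng.normal(scale=0.1, size=(3, 4)), np.zeros(4)
W2, b2 = rng.normal(scale=0.1, size=(4, 1)), np.zeros(1)

lr, momentum = 0.05, 0.9                       # temp navchannya ta impuls
velocity = [np.zeros_like(p) for p in (W1, b1, W2, b2)]

for epoch in range(200):
    # Pryamij prohid: zvazheni sumi ta peredavalna funkciya tanh u prihovanomu shari.
    h = np.tanh(X @ W1 + b1)
    out = h @ W2 + b2

    err = out - y                              # pohibka vihodu
    loss = np.mean(err ** 2)                   # serednokvadratichna funkciya vitrat

    # Zvorotne poshirennya: gradiyenti funkciyi vitrat za lancyugovim pravilom.
    d_out = 2 * err / len(X)
    gW2, gb2 = h.T @ d_out, d_out.sum(axis=0)
    d_h = (d_out @ W2.T) * (1 - h ** 2)        # pohidna tanh
    gW1, gb1 = X.T @ d_h, d_h.sum(axis=0)

    # Gradiyentnij spusk iz impulsom: krok zalezhit i vid poperednoyi zmini.
    for i, (p, g) in enumerate(zip((W1, b1, W2, b2), (gW1, gb1, gW2, gb2))):
        velocity[i] = momentum * velocity[i] - lr * g
        p += velocity[i]

print("kinceva serednokvadratichna pohibka:", loss)
```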
Paradigmi navchannya

Mashinne navchannya zazvichaj podilyayut na tri osnovni paradigmi: kerovane navchannya 112 113 114 115, nekerovane navchannya 116 113 114 117 115 ta navchannya z pidkriplennyam 118 119. Kozhna vidpovidaye pevnomu navchalnomu zavdannyu.

Kerovane navchannya

Kerovane navchannya 113 114 115 (angl. supervised learning) vikoristovuye nabir par vhodiv i bazhanih vihodiv. Zavdannya navchannya polyagaye v tomu, shobi dlya kozhnogo vhodu vidavati bazhanij vihid. U comu vipadku funkciya vitrat pov yazana z usunennyam nepravilnogo visnovuvannya 120. Vitrati, yaki vikoristovuyut zazvichaj, ce serednokvadratichna pohibka, yaka namagayetsya minimizuvati serednyu kvadratichnu pohibku vihodu merezhi vidnosno bazhanogo vihodu. Dlya kerovanogo navchannya pidhodyat zavdannya na rozpiznavannya obraziv (takozh vidome yak klasifikuvannya) ta regresiyu (takozh vidome yak nablizhennya funkciyi). Kerovane navchannya takozh zastosovne do poslidovnih danih, napriklad dlya rozpiznavannya rukopisnogo tekstu, movlennya ta zhestiv en. Jogo mozhlivo rozglyadati yak navchannya z uchitelem u viglyadi funkciyi, yaka zabezpechuye bezperervnij zvorotnij zv yazok shodo yakosti otrimanih na danij moment rishen.

Nekerovane navchannya

U nekerovanim navchanni 113 114 117 115 (angl. unsupervised learning) dani vhodu nadayutsya razom iz funkciyeyu vitrat, deyakoyu funkciyeyu vid danih $x$ ta vihodu merezhi. Funkciya vitrat zalezhit vid zavdannya (oblasti modeli) ta bud yakih apriornih pripushen (neyavnih vlastivostej modeli, yiyi parametriv ta sposterezhuvanih zminnih). Yak trivialnij priklad, rozglyanmo model $f(x) = a$, de $a$ stala, a vitrati $C = E[(x - f(x))^2]$. Minimizaciya cih vitrat daye znachennya $a$, sho dorivnyuye serednomu znachennyu danih. Funkciya vitrat mozhe buti nabagato skladnishoyu. Yiyi viglyad zalezhit vid zastosuvannya: napriklad, u stisnenni vona mozhe buti pov yazanoyu iz vzayemnoyu informaciyeyu mizh $x$ ta $f(x)$, todi yak u statistichnomu modelyuvanni vona mozhe buti pov yazanoyu z aposteriornoyu jmovirnistyu modeli za zadanih danih (zvernit uvagu, sho v oboh cih prikladah ci velichini pidlyagayut maksimizuvannyu, a ne minimizuvannyu). Zavdannya, yaki pidpadayut pid paradigmu nekerovanogo navchannya, ce zazvichaj zadachi ocinyuvannya; do cih zastosuvan nalezhat klasteruvannya, ocinyuvannya statistichnih rozpodiliv, stiskannya ta filtruvannya.

Navchannya z pidkriplennyam

Dokladnishe: Navchannya z pidkriplennyam
Div. takozh: Stohastichne keruvannya en

U takih zastosuvannyah, yak gra u videoigri, diyach (angl. actor) vikonuye nizku dij (angl. actions), otrimuyuchi zagalom neperedbachuvanij vidguk vid seredovisha pislya kozhnoyi z nih. Meta polyagaye v tomu, shobi vigrati gru, tobto poroditi najbilshu kilkist pozitivnih (z najmenshimi vitratami) vidgukiv. U navchanni z pidkriplennyam (angl. reinforcement learning) meta polyagaye v tomu, shobi zvazhiti merezhu (rozrobiti strategiyu, angl. policy) dlya vikonannya dij, yaka minimizuye dovgostrokovi ochikuvani sukupni vitrati. U kozhen moment chasu diyach vikonuye diyu, a seredovishe porodzhuye sposterezhennya ta mittyevi vitrati vidpovidno do deyakih, zazvichaj nevidomih, pravil. Zazvichaj pravila j dovgostrokovi vitrati mozhlivo lishe ocinyuvati. U bud yakij moment diyach virishuye, chi dosliditi novi diyi, shob rozkriti svoyi vitrati, a chi skoristatisya poperednim znannyam dlya shvidshogo vikonannya.

Formalno seredovishe modelyuyut yak markovskij proces virishuvannya (MPV) zi stanami $s_1, \ldots, s_n \in S$ ta diyami $a_1, \ldots, a_m \in A$. Oskilki perehodi staniv nevidomi, zamist nih vikoristovuyut rozpodili jmovirnosti: rozpodil mittyevih vitrat $P(c_t \mid s_t)$, rozpodil sposterezhen $P(x_t \mid s_t)$ ta rozpodil perehodiv $P(s_{t+1} \mid s_t, a_t)$, todi yak strategiyu viznachayut yak umovnij rozpodil dij za danih sposterezhen. Vzyati razom, voni viznachayut markovskij lancyug (ML). Meta polyagaye u viyavlenni ML iz najmenshimi vitratami. ShNM u takih zastosuvannyah sluguyut skladovoyu, yaka zabezpechuye navchannya 121 122. Dinamichne programuvannya u poyednanni z ShNM (sho daye nejrodinamichne programuvannya) 123 bulo zastosovano do takih zadach, yak ti, sho stosuyutsya marshrutizuvannya transportu en 124, videoigor, prirodokoristuvannya 125 126 ta medicini 127, cherez zdatnist ShNM pom yakshuvati vtrati tochnosti navit pri zmenshenni shilnosti gratki diskretuvannya en dlya chiselnogo nablizhennya rozv yazkiv zadach keruvannya. Zavdannya, yaki pidpadayut pid paradigmu navchannya z pidkriplennyam, ce zavdannya keruvannya, igri ta inshi poslidovni zavdannya uhvalyuvannya rishen.

Samonavchannya

Samonavchannya (angl. self learning) v nejronnih merezhah bulo zaproponovano 1982 roku razom iz nejronnoyu merezheyu, zdatnoyu do samonavchannya, nazvanoyu poperechinnim adaptivnim masivom (PAM, angl. crossbar adaptive array, CAA) 128. Ce sistema lishe z odnim vhodom (situaciyeyu s) j lishe odnim vihodom (diyeyu abo povedinkoyu a). Vona ne maye ani vhodu zovnishnih porad, ani vhodu zovnishnogo pidkriplennya z boku seredovisha. PAM obchislyuye poperechnim chinom yak rishennya shodo dij, tak i emociyi (pochuttya) shodo viniklih situacij. Cya sistema keruyetsya vzayemodiyeyu mizh piznannyam ta emociyami 129. Za zadanoyi matrici pam yati W = ||w(a,s)|| poperechinnij algoritm samonavchannya na kozhnij iteraciyi vikonuye nastupne obchislennya:

U situaciyi s vikonati diyu a;
Otrimati naslidkovu situaciyu s';
Obchisliti emociyu perebuvannya v naslidkovij situaciyi v(s');
Utochniti poperechinnu pam yat: w'(a,s) = w(a,s) + v(s').

Poshiryuvane zvorotno znachennya (vtorinne pidkriplennya, angl. secondary reinforcement) ce emociya shodo naslidkiv situaciyi. PAM isnuye u dvoh seredovishah: odne povedinkove seredovishe, de vona povoditsya, a inshe genetichne seredovishe, de vona spochatku j lishe odin raz otrimuye pochatkovi emociyi shodo situacij, z yakimi mozhlivo zitknutisya v povedinkovomu seredovishi. Otrimavshi genomnij vektor (vidovij vektor, angl. genome vector, species vector) iz genetichnogo seredovisha, PAM navchatimetsya cilespryamovanoyi povedinki v povedinkovomu seredovishi, sho mistit yak bazhani, tak i nebazhani situaciyi 130.
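Nizhche navedeno umovnij eskiz (Python) opisanogo vishe poperechinnogo algoritmu samonavchannya; jogo nemaye v pershodzhereli. Seredovishe perehodiv, genomnij vektor emocij ta sposib viboru diyi tut ye dovilnimi pripushennyami dlya ilyustraciyi samogo pravila utochnennya pam yati w'(a,s) = w(a,s) + v(s').

```python
import random

random.seed(0)

# Umovne povedinkove seredovishe: tablicya perehodiv (situaciya, diya) -> nova situaciya.
n_situations, n_actions = 5, 3
transition = {(s, a): random.randrange(n_situations)
              for s in range(n_situations) for a in range(n_actions)}

# "Genomnij vektor": pochatkovi emociyi shodo situacij (bazhana +1, nebazhana -1).
genome_v = [0.0, 0.0, 0.0, 1.0, -1.0]

# Poperechinna pam yat w(a, s), spochatku nulova.
W = [[0.0] * n_situations for _ in range(n_actions)]

def emotion(situation):
    # Emociya perebuvannya v situaciyi; tut prosto znachennya z genomnogo vektora.
    return genome_v[situation]

s = 0
for step in range(100):
    # U situaciyi s obrati diyu a (perevazhno za poperechinnoyu pam yattyu).
    if random.random() < 0.1:
        a = random.randrange(n_actions)
    else:
        a = max(range(n_actions), key=lambda act: W[act][s])
    s_next = transition[(s, a)]   # otrimati naslidkovu situaciyu s'
    v = emotion(s_next)           # obchisliti emociyu v(s')
    W[a][s] += v                  # utochniti pam yat: w'(a,s) = w(a,s) + v(s')
    s = s_next
```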
Nejroevolyuciya

Dokladnishe: Nejroevolyuciya

Nejroevolyuciya (angl. neuroevolution) mozhe stvoryuvati topologiyi ta vagi nejronnoyi merezhi za dopomogoyu evolyucijnogo obchislennya. Zavdyaki suchasnim vdoskonalennyam nejroevolyuciya konkuruye zi skladnimi pidhodami gradiyentnogo spusku 131. Odna z perevag nejroevolyuciyi polyagaye v tomu, sho vona mozhe buti mensh shilnoyu potraplyati v gluhij kut 132.

Stohastichna nejronna merezha

Stohastichni nejronni merezhi (angl. stochastic neural networks), sho pohodyat vid modelej Sherringtona Kirkpatrika en, ce odin z tipiv shtuchnih nejronnih merezh, pobudovanij shlyahom vvedennya vipadkovih variacij u merezhu: abo nadavannyam shtuchnim nejronam merezhi stohastichnih peredavalnih funkcij, abo nadavannyam yim stohastichnih vag. Ce robit yih korisnimi instrumentami dlya rozv yazuvannya zadach optimizaciyi, oskilki vipadkovi fluktuaciyi dopomagayut merezhi unikati lokalnih minimumiv 133. Stohastichni nejronni merezhi, trenovani za dopomogoyu bayesovogo pidhodu, vidomi yak bayesovi nejronni merezhi (angl. Bayesian neural network) 134.

Inshi

U bayesovij sistemi obirayut rozpodil nad naborom dozvolenih modelej takim chinom, shobi minimizuvati vitrati. Inshimi algoritmami navchannya ye evolyucijni metodi 135, genno ekspresijne programuvannya en 136, imituvannya vidpalyuvannya 137, ochikuvannya maksimizaciya, neparametrichni metodi en ta metod royu chastinok 138. Zbizhna rekursiya (angl. convergent recursion) ce algoritm navchannya dlya nejronnih merezh artikulyacijnih kontroleriv mozochkovoyi modeli en (AKMM, angl. cerebellar model articulation controller, CMAC) 139 140.

Rezhimi

Ye dva rezhimi navchannya: stohastichnij (angl. stochastic) ta paketnij (angl. batch). U stohastichnomu navchanni kozhen vhid stvoryuye pidlashtovuvannya vag. U paketnomu navchanni vagi pidlashtovuyut na osnovi paketu vhodiv, nakopichuyuchi pohibki v paketi. Stohastichne navchannya vnosit shum do procesu, vikoristovuyuchi lokalnij gradiyent, rozrahovanij z odniyeyi tochki danih; ce znizhuye shans zastryagannya merezhi v lokalnih minimumah. Prote paketne navchannya zazvichaj daye shvidshij i stabilnishij spusk do lokalnogo minimumu, oskilki kozhne utochnennya vikonuyetsya v napryamku userednenoyi pohibki paketa. Poshirenim kompromisom ye vikoristannya minipaketiv (angl. mini batches), nevelikih paketiv zi zrazkami v kozhnomu paketi, obranimi stohastichno z usogo naboru danih.
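Yak umovnu ilyustraciyu riznici mizh stohastichnim, paketnim ta minipaketnim rezhimami nizhche navedeno eskiz na Python (NumPy), yakogo nemaye v pershodzhereli: ta sama linijna model trenuyetsya gradiyentnim spuskom iz riznim rozmirom paketa. Dani, temp navchannya ta kilkist epoh ye dovilnimi pripushennyami.

```python
import numpy as np

rng = np.random.default_rng(1)

# Umovni sintetichni dani: linijna zalezhnist iz nevelikim shumom.
X = rng.normal(size=(256, 4))
true_w = np.array([2.0, -1.0, 0.5, 3.0])
y = X @ true_w + rng.normal(scale=0.1, size=256)

def train(batch_size, lr=0.01, epochs=30):
    # batch_size = 1        -> stohastichnij rezhim;
    # batch_size = len(X)   -> paketnij rezhim;
    # promizhni znachennya  -> minipaketi.
    w = np.zeros(X.shape[1])
    n = len(X)
    for _ in range(epochs):
        order = rng.permutation(n)            # vipadkove perestavlyannya zrazkiv
        for start in range(0, n, batch_size):
            idx = order[start:start + batch_size]
            Xb, yb = X[idx], y[idx]
            grad = 2 * Xb.T @ (Xb @ w - yb) / len(idx)   # gradiyent SKP na paketi
            w -= lr * grad                               # utochnennya vag
    return w

print("stohastichno (1):   ", train(batch_size=1))
print("minipaketi (32):    ", train(batch_size=32))
print("povnij paket (256): ", train(batch_size=len(X)))
```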
Tipi

Dokladnishe: Tipi shtuchnih nejronnih merezh

ShNM evolyuciyuvali u shiroke simejstvo metodik, yaki vdoskonalili riven ostannih dosyagnen u bagatoh oblastyah. Najprostishi tipi mayut odin abo kilka statichnih skladovih, vklyuchno z kilkistyu vuzliv, kilkistyu shariv, vagami vuzliv i topologiyeyu. Dinamichni tipi dozvolyayut odnomu abo dekilkom iz nih evolyuciyuvati shlyahom navchannya. Ostannye nabagato skladnishe, ale mozhe skorochuvati periodi navchannya j davati krashi rezultati. Deyaki tipi dozvolyayut / vimagayut navchannya pid keruvannyam operatora, todi yak inshi pracyuyut nezalezhno. Deyaki tipi pracyuyut viklyuchno aparatno, todi yak inshi ye suto programnimi j pracyuyut na komp yuterah zagalnogo priznachennya.

Do deyakih z osnovnih proriviv nalezhat:
zgortkovi nejronni merezhi, yaki viyavilisya osoblivo uspishnimi v obrobci vizualnih ta inshih dvovimirnih danih 141 142;
dovga korotkochasna pam yat, sho dozvolyaye unikati problemi znikannya gradiyenta 143 j mozhe obroblyati signali, yaki mistyat sumish nizko ta visokochastotnih skladovih, sho dopomagaye v rozpiznavanni movlennya z velikim slovnikovim zapasom 144 145, sintezuvanni movlennya z tekstu 146 40 147 ta fotorealistichnih golovah, sho rozmovlyayut 148;
konkurentni merezhi (angl. competitive networks), taki yak porodzhuvalni zmagalni merezhi 149, v yakih chislenni merezhi (riznoyi strukturi) zmagayutsya odna z odnoyu v takih zavdannyah, yak peremoga v gri abo vvedennya oponenta v omanu shodo avtentichnosti vhodu 71.

Pobudova merezh

Dokladnishe: Poshuk nejroarhitekturi en

Poshuk nejronnoyi arhitekturi (PNA, angl. neural architecture search, NAS) vikoristovuye mashinne navchannya dlya avtomatizuvannya pobudovi ShNM. Rizni pidhodi do PNA pobuduvali merezhi, dobre porivnyanni z sistemami, rozroblenimi vruchnu. Osnovnim algoritmom cogo poshuku ye proponuvati model kandidatku, ocinyuvati yiyi za naborom danih i vikoristovuvati rezultati yak zvorotnij zv yazok dlya navchannya merezhi PNA 150. Sered dostupnih sistem AvtoMN ta AutoKeras 151. Do problem pobudovi nalezhat viznachennya kilkosti, tipu ta z yednanosti rivniv merezhi, a takozh rozmiru kozhnogo ta tipu z yednannya (povne, agreguvalne). Giperparametri takozh slid viznachati yak chastinu pobudovi (yih ne navchayutsya), keruyuchi takimi pitannyami, yak kilkist nejroniv u kozhnomu
shari temp navchannya krok krok filtriv angl stride glibina receptivne pole ta dopovnennya dlya ZNM tosho 152 Vikoristannya red Cej rozdil ne mistit posilan na dzherela Vi mozhete dopomogti polipshiti cej rozdil dodavshi posilannya na nadijni avtoritetni dzherela Material bez dzherel mozhe buti piddano sumnivu ta vilucheno lipen 2023 Vikoristannya shtuchnih nejronnih merezh vimagaye rozuminnya yihnih harakteristik Vibir modeli Ce zalezhit vid podannya danih ta zastosuvannya Nadmirno skladni modeli navchayutsya povilno Algoritm navchannya Isnuyut chislenni kompromisi mizh algoritmami navchannya Majzhe kozhen algoritm pracyuvatime dobre z pravilnimi giperparametrami 153 dlya trenuvannya na pevnomu nabori danih Prote obrannya ta nalashtuvannya algoritmu dlya navchannya na nebachenih danih vimagaye znachnogo eksperimentuvannya Robastnist Yaksho model funkciyu vitrat ta algoritm navchannya obrano nalezhnim chinom to otrimana ShNM mozhe stati robastnoyu Mozhlivosti ShNM pidpadayut pid nastupni shiroki kategoriyi 154 Nablizhuvannya funkcij en 155 abo regresijnij analiz 156 vklyuchno z peredbachuvannyam chasovih ryadiv nablizhuvannyam dopasovanosti en 157 ta modelyuvannyam Klasifikuvannya vklyuchno z rozpiznavannyam obraziv ta poslidovnostej viyavlyannyam novizni en ta poslidovnim uhvalyuvannyam rishen 158 Obrobka danih 159 vklyuchno z filtruvannyam klasteruvannyam slipim viokremlyuvannyam signalu en 160 ta stiskannyam Robototehnika vklyuchno zi skerovuvannyam manipulyatoriv ta proteziv Zastosuvannya red Zavdyaki svoyij zdatnosti vidtvoryuvati ta modelyuvati nelinijni procesi shtuchni nejronni merezhi znajshli zastosuvannya v bagatoh disciplinah Do sfer zastosuvannya nalezhat identifikuvannya sistem en ta keruvannya nimi keruvannya transportnimi zasobami peredbachuvannya trayektoriyi 161 keruvannya procesami prirodokoristuvannya kvantova himiya 162 universalna gra v igri en 163 rozpiznavannya obraziv radarni sistemi vstanovlyuvannya oblich klasifikuvannya signaliv 164 trivimirna vidbudova 165 rozpiznavannya ob yektiv tosho analiz danih davachiv 166 rozpiznavannya poslidovnostej rozpiznavannya zhestiv movlennya rukopisnogo ta drukovanogo tekstu 167 medichna diagnostika finansi 168 napriklad peredpodijni en modeli dlya okremih finansovih dovgotrivalih prognoziv ta shtuchni finansovi rinki en dobuvannya danih unaochnyuvannya mashinnij pereklad socialnomerezhne filtruvannya 169 ta filtruvannya spamu elektronnoyi poshti en ShNM vikoristovuvali dlya diagnostuvannya kilkoh tipiv raku 170 171 ta dlya vidriznyuvannya visokoinvazivnih linij rakovih klitin vid mensh invazivnih z vikoristannyam lishe informaciyi pro formu klitin 172 173 ShNM vikoristovuvali dlya priskoryuvannya analizu nadijnosti infrastrukturi sho piddayetsya stihijnim liham 174 175 i dlya prognozuvannya prosidannya fundamentiv 176 Takozh mozhe buti korisnim pom yakshuvati poveni shlyahom vikoristannya ShNM dlya modelyuvannya doshovogo stoku 177 ShNM takozh vikoristovuvali dlya pobudovi chornoskrinkovih modelej v geonaukah gidrologiyi 178 179 modelyuvanni okeanu ta priberezhnij inzheneriyi en 180 181 ta geomorfologiyi 182 ShNM vikoristovuyut u kiberbezpeci z metoyu rozmezhovuvannya zakonnoyi diyalnosti vid zlovmisnoyi Napriklad mashinne navchannya vikoristovuvali dlya klasifikuvannya zlovmisnogo programnogo zabezpechennya pid Android 183 dlya viznachannya domeniv sho nalezhat sub yektam zagrozi i dlya viyavlyannya URL adres yaki stanovlyat zagrozu bezpeci 184 Vedutsya doslidzhennya sistem ShNM priznachenih dlya viprobuvannya na proniknennya 
dlya viyavlyannya bot merezh 185 shahrajstva z kreditnimi kartkami 186 ta merezhnih vtorgnen ShNM proponuvali yak instrument dlya rozv yazuvannya chastinnih diferencialnih rivnyan u fizici 187 188 189 ta modelyuvannya vlastivostej bagatochastinkovih vidkritih kvantovih sistem en 190 191 192 193 U doslidzhenni mozku ShNM vivchali korotkochasnu povedinku okremih nejroniv 194 dinamiku nejronnih lancyugiv sho vinikaye cherez vzayemodiyu mizh okremimi nejronami ta te yak povedinka mozhe vinikati z abstraktnih nejronnih moduliv yaki podayut cili pidsistemi Doslidzhennya rozglyadali dovgostrokovu ta korotkochasnu plastichnist nejronnih sistem ta yihnij zv yazok iz navchannyam i pam yattyu vid okremogo nejrona do sistemnogo rivnya Teoretichni vlastivosti red Obchislyuvalna potuzhnist red Yak dovedeno teoremoyu Cibenka bagatosharovij perceptron ce universalnij en nablizhuvach funkcij Prote ce dovedennya ne konstruktivne shodo kilkosti neobhidnih nejroniv topologiyi merezhi vag ta parametriv navchannya Osobliva rekurentna arhitektura z racionalnoznachnimi vagami na protivagu do povnotochnisnih dijsnoznachnih vag maye potuzhnist universalnoyi mashini Tyuringa 195 vikoristovuyuchi skinchennu kilkist nejroniv ta standartni linijni z yednannya Krim togo vikoristannya irracionalnih znachen dlya vag daye v rezultati mashinu z nadtyuringovoyu potuzhnistyu 196 197 vidsutnye v dzhereli Yemnist red Vlastivist yemnosti 198 199 angl capacity modeli vidpovidaye yiyi zdatnosti modelyuvati bud yaku zadanu funkciyu Vona pov yazana z obsyagom informaciyi yakij mozhlivo zberegti v merezhi ta z ponyattyam skladnosti Sered spilnoti vidomi dva ponyattya yemnosti informacijna yemnist ta VCh rozmirnist Informacijnu yemnist angl information capacity perceptrona retelno obgovoreno v knizi sera Devida Makkeya 200 yaka pidsumovuye robotu Tomasa Kovera 201 Yemnist merezhi standartnih nejroniv ne zgortkovih mozhlivo otrimuvati za chotirma pravilami 202 yaki viplivayut iz rozuminnya nejrona yak elektrichnogo elementa Informacijna yemnist ohoplyuye funkciyi yaki mozhlivo zmodelyuvati merezheyu za dovilnih danih vhodu Druge ponyattya VCh rozmirnist angl VC Dimension VCh rozmirnist vikoristovuye principi teoriyi miri ta znahodit maksimalnu yemnist za najkrashih mozhlivih obstavin Ce za danih vhodu pevnogo viglyadu Yak zaznacheno u 200 VCh rozmirnist dlya dovilnih vhodiv stanovit polovinu informacijnoyi yemnosti perceptrona VCh rozmirnist dlya dovilnih tochok inodi nazivayut yemnistyu pam yati angl Memory Capacity 203 Zbizhnist red Modeli mozhut ne zbigatisya poslidovno na yedinomu rozv yazku po pershe cherez mozhlivist isnuvannya lokalnih minimumiv zalezhno vid funkciyi vitrat ta modeli Po druge vzhivanij metod optimizaciyi mozhe ne garantuvati zbizhnosti yaksho vin pochinayetsya daleko vid bud yakogo lokalnogo minimumu Po tretye dlya dosit velikih danih abo parametriv deyaki metodi stayut nepraktichnimi Insha varta zgadki problema polyagaye v tomu sho navchannya mozhe prohoditi kriz deyaku sidlovu tochku sho mozhe prizvoditi do zbigannya v nepravilnomu napryamku Povedinka zbizhnosti pevnih tipiv arhitektur ShNM zrozumilisha nizh inshih Koli shirina merezhi nablizhayetsya do neskinchennosti ShNM dobre opisuyetsya svoyim rozvinennyam u ryad Tejlora pershogo poryadku protyagom navchannya i tomu uspadkovuye povedinku zbizhnosti afinnih modelej en 204 205 Inshij priklad koli parametri mali sposterigayetsya sho ShNM chasto dopasovuyutsya do cilovih funkcij vid nizkih do visokih chastot Taku povedinku nazivayut spektralnim zmishennyam 
(angl. spectral bias) abo chastotnim principom (angl. frequency principle) nejronnih merezh 206 207 208 209. Ce yavishe protilezhne povedinci deyakih dobre vivchenih iteracijnih chislovih shem, takih yak metod Yakobi. Bulo viyavleno, sho glibshi nejronni merezhi shilnishi do nizkochastotnih funkcij 210.

Uzagalnyuvalnist ta statistika

Zastosuvannya, metoyu yakih ye stvorennya sistemi, sho dobre uzagalnyuyetsya do nevidomih zrazkiv, stikayutsya z mozhlivistyu peretrenuvannya. Vono vinikaye v zaplutanih abo nadmirno viznachenih sistemah, koli yemnist merezhi znachno perevishuye potrebu u vilnih parametrah. Isnuye dva pidhodi, yak vporuvatisya z peretrenuvannyam. Pershij polyagaye u vikoristanni perehresnogo zatverdzhuvannya ta podibnih metodiv dlya perevirki nayavnosti perenavchannya ta obiranni giperparametriv dlya zvedennya pohibki uzagalnennya do minimumu. Drugij polyagaye u vikoristanni yakogos iz vidiv regulyarizaciyi. Ce ponyattya vinikaye v imovirnisnij (bayesovij) sistemi, de regulyarizaciyu mozhlivo vikonuvati shlyahom obirannya bilshoyi apriornoyi jmovirnosti nad prostishimi modelyami, ale takozh i v teoriyi statistichnogo navchannya, de metoyu ye zvoditi do minimumu dvi velichini: empirichnij rizik ta strukturnij rizik, sho grubo vidpovidayut pohibci nad trenuvalnim naborom ta peredbachuvanij pohibci v nebachenih danih cherez perenavchannya.

Dovirchij analiz nejronnoyi merezhi

Nejronni merezhi kerovanogo navchannya, yaki vikoristovuyut yak funkciyu vitrat serednokvadratichnu pohibku (SKP), dlya viznachennya doviri do trenovanoyi modeli mozhut vikoristovuvati formalni statistichni metodi. SKP na zatverdzhuvalnomu nabori mozhlivo vikoristovuvati yak ocinku dispersiyi. Ce znachennya potim mozhlivo vikoristovuvati dlya obchislennya dovirchogo intervalu vihodu merezhi, vihodyachi z normalnogo rozpodilu. Zdijsnenij takim chinom analiz doviri statistichno chinnij, poki rozpodil imovirnosti vihodu zalishayetsya nezminnim i ne vnositsya zmin do merezhi.

Priznachennya normovanoyi eksponencijnoyi funkciyi (uzagalnennya logistichnoyi funkciyi) yak peredavalnoyi funkciyi sharu vihodu nejronnoyi merezhi (abo normovanoyi eksponencijnoyi skladovoyi v nejronnij merezhi na osnovi skladovih) dlya kategorijnih cilovih zminnih daye mozhlivist interpretuvati vihodi yak aposteriorni jmovirnosti. Ce korisno dlya klasifikuvannya, oskilki daye miru vpevnenosti v klasifikaciyah. Normovana eksponencijna funkciya (angl. softmax) ce $y_i = \frac{e^{x_i}}{\sum_{j=1}^{c} e^{x_j}}$.

Kritika

Trenuvannya

Poshirena kritika nejronnih merezh, osoblivo v robototehnici, polyagaye v tomu, sho dlya roboti v realnomu sviti voni vimagayut zabagato trenuvannya 211. Do potencijnih rozv yazan nalezhit vipadkove perestavlyannya trenuvalnih zrazkiv, zastosuvannya algoritmu chiselnoyi optimizaciyi, yakij ne vimagaye zavelikih krokiv pri zmini z yednan merezhi slidom za zrazkom, grupuvannya zrazkiv do tak zvanih mini paketiv (angl. mini batches) ta/abo zaprovadzhennya algoritmu rekursivnih najmenshih kvadrativ dlya AKMM en 139.

Teoriya

Golovna pretenziya ShNM polyagaye v tomu, sho voni vtilyuyut novi potuzhni zagalni principi obrobki informaciyi. Ci principi pogano viznacheni. Chasto stverdzhuyut, sho voni vinikayut iz samoyi merezhi. Ce
dozvolyaye opisuvati prostu statistichnu asociaciyu osnovnu funkciyu shtuchnih nejronnih merezh yak navchannya abo rozpiznavannya 1997 roku Oleksandr Dyudni en zauvazhiv sho v rezultati shtuchni nejronni merezhi mayut risi chogos darmovogo chogos nadilenogo osoblivoyu auroyu ledarstva ta viraznoyi vidsutnosti zacikavlennya hoch bi tim naskilki dobrimi ci komp yuterni sistemi ye Zhodnogo vtruchannya lyudskoyi ruki ta rozumu rozv yazki znahodyatsya mov charivnoyu siloyu i nihto shozhe tak nichogo j ne navchivsya 212 Odniyeyu z vidpovidej Dyudni ye te sho nejronni merezhi rozv yazuyut bagato skladnih i riznomanitnih zavdan pochinayuchi vid avtonomnogo litalnogo aparata 213 do viyavlyannya shahrajstva z kreditnimi kartkami j zavershuyuchi opanuvannyam gri v Go Pismennik u galuzi tehnologij Rodzher Bridzhmen prokomentuvav ce tak Nejronni merezhi napriklad znahodyatsya na lavi pidsudnih ne lishe cherez te sho yih rozreklamuvali do nebes hiba ni a j cherez te sho vi mozhete stvoriti uspishnu merezhu ne rozumiyuchi yak vona pracyuye kupa chisel yaki fiksuyut yiyi povedinku jmovirno bude neprozoroyu nechitabelnoyu tabliceyu nichogo ne vartoyu yak naukovij resurs Nezvazhayuchi na svoyu ekspresivnu zayavu pro te sho nauka ce ne tehnologiya Dyudni zdayetsya ganbit nejronni merezhi yak poganu nauku todi yak bilshist iz tih hto yih rozroblyaye prosto namagayutsya buti dobrimi inzhenerami Nechitabelna tablicya yaku mozhe chitati korisna mashina vse odno bude velmi varta togo shobi yiyi mati 214 Originalnij tekst angl Neural networks for instance are in the dock not only because they have been hyped to high heaven what hasn t but also because you could create a successful net without understanding how it worked the bunch of numbers that captures its behaviour would in all probability be an opaque unreadable table valueless as a scientific resource In spite of his emphatic declaration that science is not technology Dewdney seems here to pillory neural nets as bad science when most of those devising them are just trying to be good engineers An unreadable table that a useful machine could read would still be well worth having Biologichnij mozok vikoristovuye yak negliboki tak i gliboki shemi yak povidomlyaye anatomiya mozku 215 demonstruyuchi shirokij spektr invariantnosti Veng 216 stverdzhuvav sho mozok samostijno vstanovlyuye zv yazki v osnovnomu vidpovidno do statistiki signaliv i tomu poslidovnij kaskad ne mozhe vloviti vsi osnovni statistichni zalezhnosti Aparatne zabezpechennya red Veliki j efektivni nejronni merezhi vimagayut znachnih obchislyuvalnih resursiv 217 U toj chas yak mozok maye aparatne zabezpechennya idealno pristosovane dlya zadachi obrobki signaliv grafom nejroniv imitaciya navit sproshenogo nejronu na arhitekturi fon Nejmana mozhe spozhivati velicheznu kilkist pam yati ta diskovogo prostoru Krim togo rozrobnikovi chasto potribno peredavati signali bagatma cimi z yednannyami ta pov yazanimi z nimi nejronami sho vimagaye velicheznoyi obchislyuvalnoyi potuzhnosti ta chasu CP Shmidhuber zaznachiv sho vidrodzhennya nejronnih merezh u dvadcyat pershomu storichchi znachnoyu miroyu obumovleno dosyagnennyami v aparatnomu zabezpechenni z 1991 do 2015 roku obchislyuvalna potuzhnist osoblivo zabezpechuvana GPZP na GP zrosla priblizno v miljon raziv zrobivshi standartnij algoritm zvorotnogo poshirennya pridatnim dlya navchannya merezh na kilka rivniv glibshih nizh ranishe 21 Vikoristannya priskoryuvachiv takih yak PKVM ta GP mozhe skorochuvati trivalist trenuvannya z misyaciv do dniv 217 Nejromorfna 
inzheneriya en abo fizichna nejronna merezha en rozv yazuye problemu aparatnogo zabezpechennya bezposeredno stvoryuyuchi mikroshemi vidminni vid fon nejmanovih dlya bezposerednogo vtilennya nejronnih merezh u shemah She odna mikroshema optimizovana dlya obrobki nejronnih merezh zvetsya tenzornim procesorom abo TP angl Tensor Processing Unit TPU 218 Praktichni kontrprikladi red Analizuvati te chogo navchilasya ShNM nabagato legshe nizh analizuvati te chogo navchilasya biologichna nejronna merezha Krim togo doslidniki yaki berut uchast u poshuku algoritmiv navchannya dlya nejronnih merezh postupovo rozkrivayut zagalni principi sho dozvolyayut mashini sho vchitsya buti uspishnoyu Napriklad lokalne j nelokalne navchannya ta negliboka j gliboka arhitektura 219 Gibridni pidhodi red Pribichniki gibridnih en modelej sho poyednuyut nejronni merezhi ta simvolni pidhodi stverdzhuyut sho taka sumish mozhe krashe vlovlyuvati mehanizmi lyudskogo rozumu 220 Galereya red nbsp Odnosharova shtuchna nejronna merezha pryamogo poshirennya Strilki sho vihodyat z x 2 displaystyle scriptstyle x 2 nbsp dlya naochnosti opusheno Ye p vhodiv do ciyeyi merezhi j q vihodiv U cij sistemi znachennya q togo vihodu y q displaystyle scriptstyle y q nbsp obchislyuvatimetsya yak y q K x i w i q b q displaystyle scriptstyle y q K sum x i w iq b q nbsp nbsp Dvosharova shtuchna nejronna merezha pryamogo poshirennya nbsp Shtuchna nejronna merezha nbsp Graf zalezhnostej ShNM nbsp Odnosharova shtuchna nejronna merezha pryamogo poshirennya z 4 vhodami 6 prihovanimi vuzlami ta 2 vihodami Dlya zadanogo stanu polozhennya ta napryamu vivodit znachennya keruvannya dlya kolis nbsp Dvosharova shtuchna nejronna merezha pryamogo poshirennya z 8 vhodami 2 8 prihovanimi vuzlami ta 2 vihodami Dlya zadanogo stanu polozhennya napryamu ta inshih zminnih seredovisha vidaye znachennya keruvannya dlya manevrovih dviguniv nbsp Paralelno konveyerna struktura nejronnoyi merezhi AKMM en Cej algoritm navchannya zdaten zbigatisya za odin krok Div takozh red Avtokoduvalnik ADALINE Biologichno nathneni obchislennya en Gipervimirni obchislennya en Granici velikoyi shirini nejronnih merezh en Katastrofichna interferenciya en Kvantova nejronna merezha Kognitivna arhitektura en Konektivistska ekspertna sistema en Konektomika en Nejronnij gaz Optichna nejronna merezha en Paralelno rozpodilena obrobka Ponyattya mashinnogo navchannya en Proyekt Blue Brain Programne zabezpechennya nejronnih merezh en Rekurentni nejronni merezhi Spajkova nejronna merezha en Stohastichnij papuga en Tenzorno dobutkova merezha en Filosofiya shtuchnogo intelektuVinoski red Dlya keruvannya Bez ruk cherez Ameriku en 1995 go roku znadobilosya lishe kilka vipadkiv lyudskoyi dopomogi Primitki red Hardesty Larry 14 kvitnya 2017 Explained Neural networks MIT News Office Procitovano 2 chervnya 2022 angl Yang Z R Yang Z 2014 Comprehensive Biomedical Physics Karolinska Institute Stockholm Sweden Elsevier s 1 ISBN 978 0 444 53633 4 Arhiv originalu za 28 lipnya 2022 Procitovano 28 lipnya 2022 angl Mansfield Merriman A List of Writings Relating to the Method of Least Squares angl Stigler Stephen M 1981 Gauss and the Invention of Least Squares Ann Stat 9 3 465 474 doi 10 1214 aos 1176345451 angl Bretscher Otto 1995 Linear Algebra With Applications vid 3rd Upper Saddle River NJ Prentice Hall angl a b v g d e zh i k l m n p Schmidhuber Juergen 2022 Annotated History of Modern AI and Deep Learning arXiv 2212 11279 cs NE angl Stigler Stephen M 1986 The History of Statistics The Measurement of Uncertainty 