In machine learning and statistics, feature selection, also known as variable selection, attribute selection or variable subset selection, is the process of selecting a subset of relevant features (variables, predictors) for use in model construction. Feature selection techniques are applied with several goals:

- simplification of models, to make them easier for researchers and users to interpret,
- shorter training times,
- avoiding the curse of dimensionality,
- improved generalization through reduced overfitting (formally, reduced variance).

The central premise when applying a feature selection technique is that the data contain some features that are either redundant or irrelevant, and that these can therefore be removed without causing significant loss of information. "Redundant" and "irrelevant" are two distinct notions, since one relevant feature may be redundant in the presence of another relevant feature with which it is strongly correlated.

Feature selection techniques should be distinguished from feature extraction. Feature extraction creates new features from functions of the original features, whereas feature selection returns a subset of the features. Feature selection techniques are often used in domains where there are many features and comparatively few samples (data points). Archetypal cases for the application of feature selection include the analysis of written texts and of DNA microarray data, where there are many thousands of features and only a few dozen to a few hundred samples.

Introduction

A feature selection algorithm can be seen as the combination of a search technique for proposing new feature subsets with an evaluation measure that scores the different subsets. The simplest algorithm is to test each possible subset of features and find the one that minimizes the error rate. This is an exhaustive search of the space, and is computationally intractable for all but the smallest feature sets. The choice of evaluation metric heavily influences the algorithm, and it is these evaluation metrics that distinguish the three main categories of feature selection algorithms: wrappers, filters, and embedded methods.

Wrapper methods use a predictive model to score feature subsets. Each new subset is used to train a model, which is tested on a hold-out set. Counting the number of mistakes made on that hold-out set (the error rate of the model) gives the score for that subset. Because wrapper methods train a new model for each subset, they are very computationally intensive, but they usually provide the best-performing feature set for that particular type of model.

Filter methods use a proxy measure instead of the error rate to score a feature subset. The measure is chosen to be fast to compute while still capturing the usefulness of the feature set. Common measures include the mutual information, the pointwise mutual information, the Pearson correlation coefficient, the inter/intra-class distance, and the scores of significance tests for each class/feature combination. Filters are usually less computationally intensive than wrappers, but they produce a feature set that is not tuned to a specific type of predictive model. This lack of tuning means that a feature set from a filter is more general than the set from a wrapper, usually giving lower predictive performance than a wrapper. However, the feature set does not contain the assumptions of a predictive model, and so is more useful for exposing the relationships between the features. Many filters provide a feature ranking rather than an explicit best feature subset, and the cut-off point in the ranking is chosen via cross-validation. Filter methods have also been used as a preprocessing step for wrapper methods, making it possible to apply wrappers to larger problems.
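As an illustration of the filter approach described above, the sketch below (plain Python, with hypothetical helper names such as `filter_select`) ranks discrete features by their empirical mutual information with the class and keeps the top-k:

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Empirical mutual information I(X;Y) between two discrete sequences."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def filter_select(features, target, k):
    """Filter method: score each feature independently, return the top-k names."""
    scores = {name: mutual_information(col, target) for name, col in features.items()}
    return sorted(scores, key=scores.get, reverse=True)[:k]

# Toy data: f1 copies the class, f2 is uninformative.
features = {"f1": [0, 0, 1, 1], "f2": [1, 1, 1, 1]}
target = [0, 0, 1, 1]
print(filter_select(features, target, 1))  # ['f1']
```

The scoring here is univariate; a wrapper would instead retrain a predictive model per candidate subset and score it on held-out data.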
Another popular approach is the Recursive Feature Elimination (RFE) algorithm, commonly used with support vector machines, which repeatedly constructs a model and removes the features with low weights.

Embedded methods are a catch-all group of techniques that perform feature selection as part of the model construction process. The exemplar of this approach is the LASSO method for constructing a linear model, which penalizes the regression coefficients with an L1 penalty, shrinking many of them to zero. Any features with non-zero regression coefficients are "selected" by the LASSO algorithm. Improvements to the LASSO include Bolasso, which bootstraps samples; elastic net regularization, which combines the L1 penalty of LASSO with the L2 penalty of ridge regression; and FeaLect, which scores all the features based on a combinatorial analysis of regression coefficients. In terms of computational complexity, these approaches tend to fall between filters and wrappers.

In traditional regression analysis, the most popular form of feature selection is stepwise regression, which is a wrapper technique. It is a greedy algorithm that, in each round, adds the best feature (or deletes the worst one). The main control issue is deciding when to stop the algorithm. In machine learning, this is typically done by cross-validation. In statistics, some criterion is optimized. This leads to the inherent problem of nesting. More robust methods have been explored, such as branch and bound and piecewise linear networks.

Subset selection

Subset selection evaluates a subset of features as a group for suitability. Subset selection algorithms can be broken into wrapper, filter, and embedded methods. Wrappers use a search algorithm to search through the space of possible features and evaluate each subset by running a model on it. Wrappers can be computationally expensive and carry a risk of overfitting the model. Filters are similar to wrappers in the search approach, but instead of evaluating against a model, a simpler filter is evaluated.
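The L1-penalty idea described above can be sketched in a few lines. The following is a minimal cyclic coordinate-descent LASSO (not the solvers from the literature cited here; `lasso_coordinate_descent` is an illustrative name) showing how the soft-thresholding update drives irrelevant coefficients exactly to zero, so that selection falls out of model fitting:

```python
import numpy as np

def lasso_coordinate_descent(X, y, lam, n_iter=200):
    """Minimize (1/2n)||y - Xw||^2 + lam*||w||_1 by cyclic coordinate descent."""
    n, p = X.shape
    w = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ w + X[:, j] * w[j]          # partial residual excluding feature j
            rho = X[:, j] @ r / n
            z = (X[:, j] ** 2).sum() / n
            w[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / z  # soft threshold
    return w

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.01 * rng.normal(size=100)
w = lasso_coordinate_descent(X, y, lam=0.1)
selected = [j for j in range(5) if abs(w[j]) > 1e-6]
print(selected)  # [0, 1]
```

Only the two truly informative columns survive the L1 penalty; the three noise columns get exactly zero coefficients.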
Embedded techniques are embedded in, and specific to, a model.

Many popular search approaches use greedy hill climbing, which iteratively evaluates a candidate subset of features, then modifies the subset and evaluates whether the new subset is an improvement over the old one. Evaluation of the subsets requires a scoring metric that grades the subsets of features. Exhaustive search is generally impractical, so at some stopping point defined by the implementer (or operator), the subset of features with the highest score discovered up to that point is selected as the satisfactory feature subset. The stopping criterion varies by algorithm; possible criteria include a subset score exceeding a threshold, the program's maximum allowed run time being surpassed, etc.

Alternative search-based techniques are based on targeted projection pursuit, which finds low-dimensional projections of the data that score highly; the features that have the largest projections in the lower-dimensional space are then selected.

Search approaches include:

- Exhaustive
- Best first
- Simulated annealing
- Genetic algorithm
- Greedy forward selection
- Greedy backward elimination
- Particle swarm optimization
- Targeted projection pursuit
- Scatter search
- Variable neighborhood search

Two popular filter metrics for classification problems are correlation and mutual information, although neither is a true metric or "distance measure" in the mathematical sense, since they fail to obey the triangle inequality and thus do not compute any actual "distance" — they should rather be regarded as "scores". These scores are computed between a candidate feature (or set of features) and the desired output category. There are, however, true metrics that are a simple function of the mutual information.
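The greedy hill-climbing search described above, in its forward-selection form, can be sketched as follows (`greedy_forward_selection` and the toy scoring function are illustrative names; in a wrapper, `score` would be, e.g., cross-validated model accuracy):

```python
def greedy_forward_selection(features, score, k):
    """Sequential forward selection: repeatedly add the feature whose addition
    maximizes score(subset); stop at k features or when nothing improves."""
    selected = []
    remaining = list(features)
    while remaining and len(selected) < k:
        best = max(remaining, key=lambda f: score(selected + [f]))
        if score(selected + [best]) <= score(selected):
            break                      # stopping criterion: no improvement
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy score: additive utility per distinct feature; "c" contributes nothing.
utility = {"a": 3.0, "b": 2.0, "c": 0.0}
score = lambda s: sum(utility[f] for f in set(s))
print(greedy_forward_selection(["a", "b", "c"], score, k=3))  # ['a', 'b']
```

Greedy backward elimination is the mirror image: start from the full set and repeatedly drop the feature whose removal hurts the score least.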
Other available filter metrics include:

- Class separability
  - Error probability
  - Inter-class distance
  - Probabilistic distance
  - Entropy
- Consistency-based feature selection
- Correlation-based feature selection

Optimality criteria

The choice of an optimality criterion is difficult, since there are multiple objectives in a feature selection task. Many common criteria incorporate a measure of accuracy penalized by the number of selected features. Examples include the Akaike information criterion (AIC) and Mallows's Cp, which have a penalty of 2 for each added feature. AIC is based on information theory and is effectively derived via the maximum entropy principle. Other criteria are the Bayesian information criterion (BIC), which uses a penalty of $\sqrt{\log n}$ for each added feature; the minimum description length (MDL), which asymptotically uses $\sqrt{\log n}$; the risk inflation criterion (RIC), which uses $\sqrt{2\log p}$; maximum-dependency feature selection; and a variety of new criteria motivated by the false discovery rate (FDR), which use something close to $\sqrt{2\log\frac{p}{q}}$. A maximum entropy rate criterion may also be used to select the most relevant subset of features.

Structure learning

Filter feature selection is a specific case of a more general paradigm called structure learning. Feature selection finds the relevant feature set for a specific target variable, whereas structure learning finds the relationships among all the variables, usually by expressing these relationships as a graph. The most common structure learning algorithms assume the data are generated by a Bayesian network, so the structure is a directed graphical model. The optimal solution to the filter feature selection problem is the Markov blanket of the target node, and in a Bayesian network there is a unique Markov blanket for each node.
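The penalized criteria above trade accuracy against subset size. A minimal sketch, assuming the common Gaussian-likelihood forms AIC = n·ln(RSS/n) + 2k and BIC = n·ln(RSS/n) + k·ln(n) (RSS: residual sum of squares, k: number of fitted parameters, n: number of samples), shows how the per-feature penalty can reject a larger subset that barely improves the fit:

```python
import math

def aic(rss, n, k):
    """Akaike information criterion for a Gaussian linear model (lower is better)."""
    return n * math.log(rss / n) + 2 * k

def bic(rss, n, k):
    """Bayesian information criterion: a heavier log(n) penalty per parameter."""
    return n * math.log(rss / n) + k * math.log(n)

# Two candidate feature subsets: the larger one barely reduces the residual
# sum of squares, so the per-feature penalty decides against it.
n = 100
small = {"k": 2, "rss": 50.0}
big = {"k": 6, "rss": 48.0}
print(aic(small["rss"], n, small["k"]) < aic(big["rss"], n, big["k"]))  # True
print(bic(small["rss"], n, small["k"]) < bic(big["rss"], n, big["k"]))  # True
```

Since BIC's penalty grows with n while AIC's does not, BIC rejects marginal features even more aggressively on large samples.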
Information-theoretic feature selection mechanisms

There are different feature selection mechanisms that use mutual information to score the different features. They usually all use the same algorithm:

1. Compute the mutual information as a score between all features $f_i \in F$ and the target class $c$.
2. Select the feature with the largest score (e.g. $\arg\max_{f_i \in F} I(f_i;c)$) and add it to the set of selected features $S$.
3. Compute a derived score, which may be derived from the mutual information.
4. Select the feature with the largest score and add it to the set of selected features (e.g. $\arg\max_{f_i \in F} I_{\mathrm{derived}}(f_i;c)$).
5. Repeat 3 and 4 until a certain number of features is selected (e.g. $|S| = l$).

The simplest approach uses the mutual information itself as the derived score. However, there are different approaches that try to reduce the redundancy among the features.

Minimum-redundancy-maximum-relevance (mRMR) feature selection

Peng et al. proposed a feature selection method that can use either mutual information, correlation, or distance/similarity scores to select features. The aim is to penalize a feature's relevancy by its redundancy in the presence of the other selected features. The relevance of a feature set S for the class c is defined as the average value of all mutual information values between the individual features fi and the class c, as follows:

$$D(S,c) = \frac{1}{|S|}\sum_{f_i\in S} I(f_i;c).$$

The redundancy of all features in the set S is the average value of all mutual information values between the feature fi and the feature fj:

$$R(S) = \frac{1}{|S|^2}\sum_{f_i,f_j\in S} I(f_i;f_j).$$
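A greedy sketch using these two quantities can make the idea concrete: at each step, add the candidate maximizing (relevance to the class) minus (average redundancy with the features already selected). Function names and toy data are illustrative, not the authors' implementation:

```python
import math
from collections import Counter

def mi(xs, ys):
    """Empirical mutual information between two discrete sequences."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * math.log2(n * c / (px[x] * py[y]))
               for (x, y), c in pxy.items())

def mrmr(features, target, k):
    """Greedy mRMR-style selection: relevance minus average redundancy."""
    selected, candidates = [], list(features)
    while candidates and len(selected) < k:
        def score(f):
            relevance = mi(features[f], target)
            redundancy = (sum(mi(features[f], features[s]) for s in selected)
                          / len(selected)) if selected else 0.0
            return relevance - redundancy
        best = max(candidates, key=score)
        selected.append(best)
        candidates.remove(best)
    return selected

target = [0, 0, 1, 1, 0, 1]
features = {
    "f1": [0, 0, 1, 1, 0, 0],   # strongly relevant
    "f2": [0, 0, 1, 1, 0, 0],   # duplicate of f1: relevant but fully redundant
    "f3": [0, 1, 0, 1, 0, 1],   # weakly relevant, independent of f1
}
print(mrmr(features, target, k=2))  # ['f1', 'f3']
```

The redundancy term makes the duplicate `f2` lose to the weaker but complementary `f3` in the second round, exactly the behavior the criterion is designed for.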
The minimum-redundancy-maximum-relevance (mRMR) criterion is a combination of the two measures given above, and is defined as follows:

$$\mathrm{mRMR} = \max_{S}\left[\frac{1}{|S|}\sum_{f_i\in S} I(f_i;c) - \frac{1}{|S|^2}\sum_{f_i,f_j\in S} I(f_i;f_j)\right].$$

Suppose there are n full-set features. Let xi be the set-membership indicator function for feature fi, so that xi = 1 indicates presence and xi = 0 indicates absence of feature fi in the globally optimal feature set. Let $c_i = I(f_i;c)$ and $a_{ij} = I(f_i;f_j)$. The above may then be written as an optimization problem:

$$\mathrm{mRMR} = \max_{x\in\{0,1\}^n}\left[\frac{\sum_{i=1}^n c_i x_i}{\sum_{i=1}^n x_i} - \frac{\sum_{i,j=1}^n a_{ij} x_i x_j}{\left(\sum_{i=1}^n x_i\right)^2}\right].$$

The mRMR algorithm is an approximation of the theoretically optimal maximum-dependency feature selection algorithm, which maximizes the mutual information between the joint distribution of the selected features and the classification variable. Since mRMR approximates the combinatorial estimation problem with a series of much smaller problems, each involving only two variables, it uses pairwise joint probabilities, which are more robust. In certain situations the algorithm may underestimate the usefulness of features, since it has no way to measure interactions between features that can increase relevancy. This can lead to poor performance when features are individually useless but useful in combination (a pathological case occurs when the class is a parity function of the features). Overall, the algorithm is more efficient (in terms of the amount of data required) than the theoretically optimal maximum-dependency selection, yet it produces a feature set with little pairwise redundancy.

mRMR is an instance of a large class of filter methods that trade off between relevancy and redundancy in different ways.
Quadratic programming feature selection

mRMR is a typical example of an incremental greedy strategy for feature selection: once a feature has been selected, it cannot be deselected at a later stage. While mRMR could be optimized using floating search to reduce some features, it can also be reformulated as a global quadratic programming optimization problem, called quadratic programming feature selection (QPFS), as follows:

$$\mathrm{QPFS}:\ \min_{\mathbf{x}}\left\{\alpha\,\mathbf{x}^T H\mathbf{x} - \mathbf{x}^T F\right\}\quad \text{s.t.}\ \sum_{i=1}^n x_i = 1,\ x_i\ge 0,$$

where $F_{n\times 1} = [I(f_1;c),\ldots,I(f_n;c)]^T$ is the vector of feature relevancy (assuming there are n features in total), $H_{n\times n} = [I(f_i;f_j)]_{i,j=1,\ldots,n}$ is the matrix of pairwise feature redundancy, and $\mathbf{x}_{n\times 1}$ represents the relative feature weights. QPFS is solved via quadratic programming. It has recently been shown that QPFS is biased towards features with smaller entropy, because it places the feature self-redundancy term $I(f_i;f_i)$ on the diagonal of H.

Conditional mutual information

Another score derived from the mutual information is based on conditional relevancy:

$$\mathrm{SPEC_{CMI}}:\ \max_{\mathbf{x}}\left\{\mathbf{x}^T Q\mathbf{x}\right\}\quad \text{s.t.}\ \|\mathbf{x}\| = 1,\ x_i\ge 0,$$

where $Q_{ii} = I(f_i;c)$ and $Q_{ij} = I(f_i;c\mid f_j)$ for $i\neq j$. An advantage of SPEC_CMI is that it can be solved simply by finding the dominant eigenvector of Q, and is therefore very scalable. SPEC_CMI also handles second-order feature interactions.
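Finding the dominant eigenvector is cheap even at scale, e.g. via power iteration. A minimal sketch (the entries of `Q` below are hypothetical, standing in for the mutual-information values $I(f_i;c)$ and $I(f_i;c\mid f_j)$):

```python
import numpy as np

def dominant_eigenvector(Q, n_iter=500):
    """Power iteration: dominant eigenvector of a square matrix Q."""
    x = np.ones(Q.shape[0]) / np.sqrt(Q.shape[0])
    for _ in range(n_iter):
        x = Q @ x
        x /= np.linalg.norm(x)
    return x

# Hypothetical score matrix: Q[i][i] ~ I(f_i; c), Q[i][j] ~ I(f_i; c | f_j).
Q = np.array([[0.9, 0.4, 0.3],
              [0.4, 0.5, 0.2],
              [0.3, 0.2, 0.1]])
x = dominant_eigenvector(Q)
ranking = np.argsort(-x)          # features ordered by their SPEC_CMI weight
print(ranking)  # [0 1 2]
```

Since Q is entrywise non-negative, the Perron-Frobenius eigenvector has non-negative components, so the constraint $x_i \ge 0$ is satisfied automatically and the components can be read directly as feature weights.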
Joint mutual information

In a study of different scores, Brown et al. recommended the joint mutual information (JMI) as a good score for feature selection. The score tries to find the feature that adds the most new information to the already selected features, in order to avoid redundancy. The score is formulated as follows:

$$\begin{aligned} JMI(f_i) &= \sum_{f_j\in S} \bigl(I(f_j;c) + I(f_i;c\mid f_j)\bigr)\\ &= \sum_{f_j\in S}\Bigl[I(f_j;c) + I(f_i;c) - \bigl(I(f_i;f_j) - I(f_i;f_j\mid c)\bigr)\Bigr]. \end{aligned}$$

The score uses the conditional mutual information to estimate the redundancy between the already selected features ($f_j\in S$) and the feature under investigation ($f_i$).

Hilbert-Schmidt Independence Criterion Lasso based feature selection

For high-dimensional, small-sample data (e.g., dimensionality > 10^5 and number of samples < 10^3), the Hilbert-Schmidt Independence Criterion Lasso (HSIC Lasso) is useful. The HSIC Lasso optimization problem is given as

$$\mathrm{HSIC\ Lasso}:\ \min_{\mathbf{x}}\ \frac{1}{2}\sum_{k,l=1}^n x_k x_l\,\mathrm{HSIC}(f_k,f_l) - \sum_{k=1}^n x_k\,\mathrm{HSIC}(f_k,c) + \lambda\|\mathbf{x}\|_1,\quad \text{s.t.}\ x_1,\ldots,x_n\ge 0,$$

where $\mathrm{HSIC}(f_k,c) = \mathrm{tr}(\bar{\mathbf{K}}^{(k)}\bar{\mathbf{L}})$ is a kernel-based independence measure called the (empirical) Hilbert-Schmidt independence criterion (HSIC), $\mathrm{tr}(\cdot)$ denotes the trace, $\lambda$ is the regularization parameter, $\bar{\mathbf{K}}^{(k)} = \mathbf{\Gamma}\mathbf{K}^{(k)}\mathbf{\Gamma}$ and $\bar{\mathbf{L}} = \mathbf{\Gamma}\mathbf{L}\mathbf{\Gamma}$ are the centered input and output Gram matrices, $K^{(k)}_{i,j} = K(u_{k,i},u_{k,j})$ and $L_{i,j} = L(c_i,c_j)$ are Gram matrices, $K(u,u')$ and $L(c,c')$ are kernel functions, $\mathbf{\Gamma} = \mathbf{I}_m - \frac{1}{m}\mathbf{1}_m\mathbf{1}_m^T$ is the centering matrix, $\mathbf{I}_m$ is the m-dimensional identity matrix (m: the number of samples),
$\mathbf{1}_m$ is the m-dimensional vector of all ones, and $\|\cdot\|_1$ is the $\ell_1$-norm. HSIC always takes a non-negative value, and is zero if and only if the two random variables are statistically independent, provided a universal reproducing kernel such as the Gaussian kernel is used.

The HSIC Lasso can also be written as

$$\mathrm{HSIC\ Lasso}:\ \min_{\mathbf{x}}\ \frac{1}{2}\left\|\bar{\mathbf{L}} - \sum_{k=1}^n x_k\bar{\mathbf{K}}^{(k)}\right\|_F^2 + \lambda\|\mathbf{x}\|_1,\quad \text{s.t.}\ x_1,\ldots,x_n\ge 0,$$

where $\|\cdot\|_F$ is the Frobenius norm. This optimization problem is a Lasso problem, and can thus be solved efficiently with a state-of-the-art Lasso solver such as the dual augmented Lagrangian method.

Correlation feature selection

The correlation feature selection (CFS) measure evaluates subsets of features on the basis of the following hypothesis: "Good feature subsets contain features highly correlated with the classification, yet uncorrelated to each other". The following equation gives the merit of a feature subset S consisting of k features:

$$\mathrm{Merit}_{S_k} = \frac{k\,\overline{r_{cf}}}{\sqrt{k + k(k-1)\,\overline{r_{ff}}}}.$$

Here, $\overline{r_{cf}}$ is the average value of all feature-classification correlations, and $\overline{r_{ff}}$ is the average value of all feature-feature correlations. The CFS criterion is defined as follows:

$$\mathrm{CFS} = \max_{S_k}\left[\frac{r_{cf_1} + r_{cf_2} + \cdots + r_{cf_k}}{\sqrt{k + 2(r_{f_1f_2} + \cdots + r_{f_if_j} + \cdots + r_{f_kf_1})}}\right].$$

The $r_{cf_i}$ and $r_{f_if_j}$ variables are referred to as correlations, but they are not necessarily Pearson's correlation coefficient or Spearman's ρ.
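The merit formula above is a one-liner; a quick sketch (illustrative values for the average correlations) shows how inter-feature redundancy lowers the merit of an otherwise equally relevant subset:

```python
import math

def cfs_merit(k, r_cf, r_ff):
    """CFS merit of a k-feature subset given the average feature-class
    correlation r_cf and the average feature-feature correlation r_ff."""
    return k * r_cf / math.sqrt(k + k * (k - 1) * r_ff)

# Same relevance (r_cf = 0.6), different redundancy: nearly uncorrelated
# features beat highly inter-correlated ones.
print(round(cfs_merit(3, 0.6, 0.1), 3))  # 0.949
print(round(cfs_merit(3, 0.6, 0.8), 3))  # 0.645
```

The denominator grows with $\overline{r_{ff}}$, so adding a feature that is strongly correlated with the ones already chosen can reduce the subset's merit even though it is individually relevant.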
Mark Hall's doctoral dissertation uses neither of these, but instead uses three different measures of relatedness: the minimum description length (MDL), symmetrical uncertainty, and relief.

Let xi be the set-membership indicator function for feature fi; then the above can be rewritten as an optimization problem:

$$\mathrm{CFS} = \max_{x\in\{0,1\}^n}\left[\frac{\left(\sum_{i=1}^n a_i x_i\right)^2}{\sum_{i=1}^n x_i + \sum_{i\neq j} 2b_{ij}\,x_i x_j}\right].$$

The combinatorial problems above are, in fact, mixed 0-1 linear programming problems that can be solved with branch-and-bound algorithms.

Regularized trees

The features from a decision tree or a tree ensemble have been shown to be redundant. A recent method called the regularized tree can be used for feature subset selection. Regularized trees penalize the use of a variable similar to the variables selected at previous tree nodes for splitting the current node. Regularized trees need to build only one tree model (or one tree ensemble model) and are thus computationally efficient.

Regularized trees naturally handle numerical and categorical features, interactions, and nonlinearities. They are invariant to attribute scales (units) and insensitive to outliers, and thus require little data preprocessing such as normalization. The regularized random forest (RRF) is one type of regularized tree. The guided RRF is an enhanced RRF that is guided by the importance scores from an ordinary random forest.

Overview of metaheuristic methods

A metaheuristic is a general description of an algorithm dedicated to solving difficult (typically NP-hard) optimization problems for which there is no classical solving method. Generally, a metaheuristic is a stochastic algorithm tending towards a global optimum. There are many metaheuristics, from simple local search to complex global search algorithms.
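As a concrete metaheuristic sketch, the subset search can be run with simulated annealing (listed earlier among the search approaches): flip one feature in or out per step, and accept worsening moves with a probability that shrinks as the temperature cools. All names and the toy objective below are illustrative:

```python
import math
import random

def anneal_select(features, score, n_steps=2000, t0=1.0, seed=0):
    """Simulated-annealing search over feature subsets. Worse subsets are
    accepted with probability exp(delta / T); T cools geometrically."""
    rng = random.Random(seed)
    current = set()
    best = (score(current), frozenset(current))
    for step in range(n_steps):
        t = t0 * 0.995 ** step
        f = rng.choice(features)
        candidate = current ^ {f}                 # toggle membership of f
        delta = score(candidate) - score(current)
        if delta > 0 or rng.random() < math.exp(delta / t):
            current = candidate
            if score(current) > best[0]:
                best = (score(current), frozenset(current))
    return set(best[1])

# Toy objective: reward features "a" and "b", penalize subset size.
target_score = lambda s: 2.0 * ("a" in s) + 1.5 * ("b" in s) - 0.5 * len(s)
print(anneal_select(["a", "b", "c", "d"], target_score))  # {'a', 'b'}
```

In a real wrapper, `score` would be a cross-validated model evaluation, which is exactly what makes such searches expensive when the number of candidate features is large.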
Main principles

Feature selection methods are typically presented in three classes, based on how they combine the selection algorithm with the model building.

Filter method

(Figure: the filter method for feature selection.)

Filter-type methods select variables regardless of the model. They are based only on general properties, such as the correlation with the variable to predict. Filter methods suppress the least interesting variables; the other variables become part of the classification or regression model used to classify or predict the data. These methods are particularly effective in computation time and robust to overfitting. However, filter methods tend to select redundant variables, because they do not consider the relationships between the variables. More elaborate criteria, such as the FCBF algorithm, nevertheless try to minimize this problem by removing variables highly correlated with each other.

Wrapper method

(Figure: the wrapper method for feature selection.)

Wrapper methods evaluate subsets of variables, which, unlike filter approaches, allows possible interactions between variables to be detected. The two main disadvantages of these methods are:

- the increased overfitting risk when the number of observations is insufficient, and
- the significant computation time when the number of variables is large.

Embedded method

(Figure: the embedded method for feature selection.)

Embedded methods have recently been proposed that try to combine the advantages of both previous methods. The learning algorithm takes advantage of its own variable selection process and performs feature selection and classification simultaneously, as the FRMT algorithm does.

Application of feature selection metaheuristics

This is a survey of applications of feature selection metaheuristics recently used in the literature. The survey was compiled by J. Hammon in her 2013 thesis.

| Application | Algorithm | Approach | Classifier | Evaluation function | Reference |
|---|---|---|---|---|---|
| SNPs | Feature selection using feature similarity | Filter | — | r² | Phuong 2005 |
| SNPs | Genetic algorithm | Wrapper | Decision tree | Classification accuracy (10-fold) | Shah 2004 |
| SNPs | Hill climbing | Filter + wrapper | Naive Bayes | Predicted residual sum of squares | Long 2007 |
| SNPs | Simulated annealing | — | Naive Bayes | Classification accuracy (5-fold) | Ustunkar 2011 |
| Speech segmentation | Ant colony | Wrapper | Artificial neural network | MSE | Al-ani 2005 |
| Marketing | Simulated annealing | Wrapper | Regression | AIC, r² | Meiri 2006 |
| Economics | Simulated annealing, genetic algorithm | Wrapper | Regression | BIC | Kapetanios 2005 |
| Spectral mass | Genetic algorithm | Wrapper | Multiple linear regression, partial least squares | Root-mean-square error of prediction | Broadhurst 2007 |
| Spam | Binary PSO + mutation | Wrapper | Decision tree | Weighted cost | Zhang 2014 |
| Microarray | Tabu search + PSO | Wrapper | Support vector machine, k-nearest neighbors | Euclidean distance | Chuang 2009 |
| Microarray | PSO + genetic algorithm | Wrapper | Support vector machine | Classification accuracy (10-fold) | Alba 2007 |
| Microarray | Genetic algorithm + iterated local search | Embedded | Support vector machine | Classification accuracy (10-fold) | Duval 2009 |
| Microarray | Iterated local search | Wrapper | Regression | Posterior probability | Hans 2007 |
| Microarray | Genetic algorithm | Wrapper | k-nearest neighbors | Classification accuracy (leave-one-out CV) | Jirapech-Umpai 2005 |
| Microarray | Hybrid genetic algorithm | Wrapper | k-nearest neighbors | Classification accuracy (leave-one-out CV) | Oh 2004 |
| Microarray | Genetic algorithm | Wrapper | Support vector machine | Sensitivity and specificity | Xuan 2011 |
| Microarray | Genetic algorithm | Wrapper | All-paired support vector machine | Classification accuracy (leave-one-out CV) | Peng 2003 |
| Microarray | Genetic algorithm | Embedded | Support vector machine | Classification accuracy (10-fold) | Hernandez 2007 |
| Microarray | Genetic algorithm | Hybrid | Support vector machine | Classification accuracy (leave-one-out CV) | Huerta 2006 |
| Microarray | Genetic algorithm | — | Support vector machine | Classification accuracy (10-fold) | Muni 2006 |
| Microarray | Genetic algorithm | Wrapper | Support vector machine | EH-DIALL, CLUMP | Jourdan 2004 |
| Alzheimer's disease | Welch's t-test | Filter | Kernel support vector machine | Classification accuracy (10-fold) | Zhang 2015 |
| Computer vision | Infinite feature selection | Filter | Independent | Average precision, ROC AUC | Roffo 2015 |
| Microarrays | Eigenvector-centrality feature selection | Filter | Independent | Average precision, accuracy, ROC AUC | Roffo & Melzi 2016 |
| XML | Symmetrical Tau (ST) | Filter | Structural associative classification | Accuracy, coverage | Shaharanee & Hadzic 2014 |

Feature selection embedded in learning algorithms

Some learning algorithms perform feature selection as part of their overall operation. These include:

- $l_1$-regularization techniques, such as sparse regression, LASSO, and the $l_1$-SVM
- Regularized trees, e.g. the regularized random forest implemented in the RRF package
- Decision trees
- Memetic algorithms
- Random multinomial logit (RMNL)
- Auto-encoding networks with a bottleneck layer
- Submodular feature selection
- Local-learning-based feature selection. Compared with traditional methods, it does not involve any heuristic search, can easily handle multi-class problems, and works for both linear and nonlinear problems. It is also supported by a strong theoretical foundation. Numerical experiments have shown that the method can achieve a close-to-optimal solution even when the data contain over a million irrelevant features.

See also

- Cluster analysis
- Data mining
- Dimensionality reduction
- Feature extraction
- Hyperparameter optimization
- Model selection

Notes

- Gareth James; Daniela Witten; Trevor Hastie; Robert Tibshirani (2013). An Introduction to Statistical Learning. Springer. p. 204. Retrieved 20 January 2016.
- Bermingham, Mairead L.; Pong-Wong, Ricardo; Spiliopoulou, Athina; Hayward, Caroline; Rudan, Igor; Campbell, Harry; Wright, Alan F.; Wilson, James F.; Agakov, Felix; Navarro, Pau; Haley, Chris S. (2015). "Application of high-dimensional feature selection: evaluation for genomic prediction in man". Scientific Reports 5: 10312. Bibcode:2015NatSR...510312B. doi:10.1038/srep10312. PMC 4437376. PMID 25988841.
- Guyon, Isabelle; Elisseeff, André (2003). "An Introduction to Variable and Feature Selection". JMLR 3.
- Yang, Yiming; Pedersen, Jan O. (1997). "A comparative study on feature selection in text categorization" (PDF). ICML.
- Urbanowicz, Ryan J.; Meeker, Melissa; La Cava, William; Olson, Randal S.; Moore, Jason H. (22 November 2017). "Relief-Based Feature Selection: Introduction and Review". arXiv:1711.08421 [cs.DS].
- Forman, George (2003). "An extensive empirical study of feature selection metrics for text classification" (PDF). Journal of Machine Learning Research 3: 1289-1305.
- Zhang, Yishi; Li, Shujuan; Wang, Teng; Zhang, Zigang (2013). "Divergence-based feature selection for separate classes". Neurocomputing 101 (4): 32-42. doi:10.1016/j.neucom.2012.06.036.
- Guyon, I.; Weston, J.; Barnhill, S.; Vapnik, V. (2002). "Gene selection for cancer classification using support vector machines". Machine Learning 46 (1-3): 389-422. doi:10.1023/A:1012487302797.
- Bach, Francis R. (2008). "Bolasso: model consistent lasso estimation through the bootstrap". pp. 33-40. doi:10.1145/1390156.1390161. ISBN 9781605582054.
- Zare, Habil (2013). "Scoring relevancy of features based on combinatorial analysis of Lasso with application to lymphoma diagnosis". BMC Genomics 14: S14. doi:10.1186/1471-2164-14-S1-S14. PMC 3549810. PMID 23369194.
- Soufan, Othman; Kleftogiannis, Dimitrios; Kalnis, Panos; Bajic, Vladimir B. (26 February 2015). "DWFS: A Wrapper Feature Selection Tool Based on a Parallel Genetic Algorithm". PLOS ONE 10 (2): e0117988. Bibcode:2015PLoSO..1017988S. doi:10.1371/journal.pone.0117988. ISSN 1932-6203. PMC 4342225. PMID 25719748.
- Figueroa, Alejandro (2015). "Exploring effective features for recognizing the user intent behind web queries". Computers in Industry 68: 162-169. doi:10.1016/j.compind.2015.01.005.
- Figueroa, Alejandro; Neumann, Günter (2013). "Learning to Rank Effective Paraphrases from Query Logs for Community Question Answering". AAAI.
- Figueroa, Alejandro; Neumann, Günter (2014). "Category-specific models for ranking effective paraphrases in community Question Answering". Expert Systems with Applications 41 (10): 4730-4742. doi:10.1016/j.eswa.2014.02.004.
- Zhang, Y.; Wang, S.; Phillips, P. (2014). "Binary PSO with Mutation Operator for Feature Selection using Decision Tree applied to Spam Detection". Knowledge-Based Systems 64: 22-31. doi:10.1016/j.knosys.2014.03.015.
- García-López, F. C.; García-Torres, M.; Melián, B.; Moreno-Pérez, J. A.; Moreno-Vega, J. M. (2006). "Solving feature subset selection problem by a Parallel Scatter Search". European Journal of Operational Research 169 (2): 477-489.
- García-López, F. C.; García-Torres, M.; Melián, B.; Moreno-Pérez, J. A.; Moreno-Vega, J. M. (2004). "Solving Feature Subset Selection Problem by a Hybrid Metaheuristic". First International Workshop on Hybrid Metaheuristics, pp. 59-68.
- García-Torres, M.; Gómez-Vela, F.; Melián, B.; Moreno-Vega, J. M. (2016). "High-dimensional feature selection via feature grouping: A Variable Neighborhood Search approach". Information Sciences 326: 102-118.
- Kraskov, Alexander; Stögbauer, Harald; Andrzejak, Ralph G.; Grassberger, Peter (2003). "Hierarchical Clustering Based on Mutual Information". arXiv:q-bio/0311039. Bibcode:2003q.bio....11039K.
- Akaike, H. (1985). "Prediction and entropy". In Atkinson, A. C. (ed.) (PDF). Springer. pp. 1-24.
- Burnham, K. P.; Anderson, D. R. (2002). Model Selection and Multimodel Inference: A practical information-theoretic approach (2nd ed.). Springer-Verlag.
- Einicke, G. A. (2018). "Maximum-Entropy Rate Selection of Features for Classifying Changes in Knee and Ankle Dynamics During Running". IEEE Journal of Biomedical and Health Informatics 28 (4): 1097-1103. doi:10.1109/JBHI.2017.2711487. PMID 29969403.
- Aliferis, Constantin (2010). "Local causal and Markov blanket induction for causal discovery and feature selection for classification, part I: Algorithms and empirical evaluation" (PDF). Journal of Machine Learning Research 11: 171-234.
- Brown, Gavin; Pocock, Adam; Zhao, Ming-Jie; Luján, Mikel (2012). "Conditional Likelihood Maximisation: A Unifying Framework for Information Theoretic Feature Selection". Journal of Machine Learning Research 13: 27-66.
- Peng, H. C.; Long, F.; Ding, C. (2005). "Feature selection based on mutual information: criteria of max-dependency, max-relevance, and min-redundancy". IEEE Transactions on Pattern Analysis and Machine Intelligence 27 (8): 1226-1238. CiteSeerX 10.1.1.63.5765. doi:10.1109/TPAMI.2005.159. PMID 16119262.
- Nguyen, H.; Franke, K.; Petrovic, S. (2010). "Towards a Generic Feature-Selection Measure for Intrusion Detection". Proc. International Conference on Pattern Recognition (ICPR), Istanbul, Turkey.
- Rodriguez-Lujan, I.; Huerta, R.; Elkan, C.; Santa Cruz, C. (2010). "Quadratic programming feature selection" (PDF). JMLR 11: 1491-1516.
- Nguyen, X. Vinh; Chan, Jeffrey; Romano, Simone; Bailey, James (2014). "Effective Global Approaches for Mutual Information based Feature Selection". Proceedings of the 20th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD'14), August 24-27, New York City.
- Yang, Howard Hua; Moody, John (2000). "Data visualization and feature selection: New algorithms for nongaussian data" (PDF). Advances in Neural Information Processing Systems: 687-693.
- Yamada, M.; Jitkrittum, W.; Sigal, L.; Xing, E. P.; Sugiyama, M. (2014). "High-Dimensional Feature Selection by Feature-Wise Non-Linear Lasso". Neural Computation 26 (1): 185-207.
- Hall, M. (1999). Correlation-based Feature Selection for Machine Learning.
- Senliol, Baris; et al. (2008). "Fast Correlation Based Filter (FCBF) with a different search strategy". 23rd International Symposium on Computer and Information Sciences (ISCIS'08). IEEE.
- Nguyen, Hai; Franke, Katrin; Petrovic, Slobodan (2009). "Optimizing a class of feature selection measures". Proceedings of the NIPS 2009 Workshop on Discrete Optimization in Machine Learning: Submodularity, Sparsity & Polyhedra (DISCML), Vancouver, Canada.
- Deng, H.; Runger, G. (2012). "Feature Selection via Regularized Trees". Proceedings of the 2012 International Joint Conference on Neural Networks (IJCNN). IEEE.
- RRF: Regularized Random Forest, R package on CRAN.
- Hammon, J. (November 2013). Optimisation combinatoire pour la sélection de variables en régression en grande dimension: Application en génétique animale (PDF) (in French).
- Phuong, T. M.; Lin, Z.; Altman, R. B. (2005). "Choosing SNPs using feature selection". Proceedings / IEEE Computational Systems Bioinformatics Conference (CSB), pp. 301-309. PMID 16447987.
- Saghapour, E.; Kermani, S.; Sehhati, M. (2017). "A novel feature ranking method for prediction of cancer stages using proteomics data". PLOS ONE 12 (9): e0184203. doi:10.1371/journal.pone.0184203.
- Shah, S. C.; Kusiak, A. (2004). "Data mining and genetic algorithm based gene/SNP selection". Artificial Intelligence in Medicine 31 (3): 183-196. doi:10.1016/j.artmed.2004.04.002. PMID 15302085.
- Long, N.; Gianola, D.; Weigel, K. A. (2011). "Dimension reduction and variable selection for genomic selection: application to predicting milk yield in Holsteins". Journal of Animal Breeding and Genetics 128 (4): 247-257. doi:10.1111/j.1439-0388.2011.00917.x. PMID 21749471.
- Üstünkar, G.; Özöğür-Akyüz, S.; Weber, G. W.; Friedrich, C. M.; Aydın Son, Yeşim (November 2011). "Selection of representative SNP sets for genome-wide association studies: a metaheuristic approach". Optimization Letters.
- Meiri, R.; Zahavi, J. (2006). "Using simulated annealing to optimize the feature selection problem in marketing applications". European Journal of Operational Research
vol 171 no 3 pages 842 858 Juin 2006 angl G Kapetanios Variable Selection using Non Standard Optimisation of Information Criteria Working Paper 533 Queen Mary University of London School of Economics and Finance 2005 angl D Broadhurst R Goodacre A Jones J J Rowland et D B Kell Genetic algorithms as a method for variable selection in multiple linear regression and partial least squares regression with applications to pyrolysis mass spectrometry Analytica Chimica Acta vol 348 no 1 3 pages 71 86 August 1997 angl Chuang L Y Yang C H 2009 Tabu search and binary particle swarm optimization for feature selection using microarray data Journal of Computational Biology 16 12 1689 1703 doi 10 1089 cmb 2007 0211 PMID 20047491 angl E Alba J Garia Nieto L Jourdan et E G Talbi Gene Selection in Cancer Classification using PSO SVM and GA SVM Hybrid Algorithms 2016 08 18 u Wayback Machine Congress on Evolutionary Computation Singapor Singapore 2007 2007 angl B Duval J K Hao et J C Hernandez Hernandez A memetic algorithm for gene selection and molecular classification of an cancer In Proceedings of the 11th Annual conference on Genetic and evolutionary computation GECCO 09 pages 201 208 New York NY USA 2009 ACM angl C Hans A Dobra et M West Shotgun stochastic search for large p regression Journal of the American Statistical Association 2007 angl Aitken S 2005 Feature selection and classification for microarray data analysis Evolutionary methods for identifying predictive genes BMC Bioinformatics 6 1 148 doi 10 1186 1471 2105 6 148 PMC 1181625 PMID 15958165 a href wiki D0 A8 D0 B0 D0 B1 D0 BB D0 BE D0 BD Cite journal title Shablon Cite journal cite journal a Obslugovuvannya CS1 Storinki iz nepoznachenim DOI z bezkoshtovnim dostupom posilannya angl Oh I S Moon B R 2004 Hybrid genetic algorithms for feature selection en 26 11 1424 1437 CiteSeerX 10 1 1 467 4179 doi 10 1109 tpami 2004 105 PMID 15521491 angl Xuan P Guo M Z Wang J Liu X Y Liu Y 2011 Genetic algorithm based efficient 
feature selection for classification of pre miRNAs Genetics and Molecular Research 10 2 588 603 doi 10 4238 vol10 2gmr969 PMID 21491369 angl Peng S 2003 Molecular classification of cancer types from microarray data using the combination of genetic algorithms and support vector machines FEBS Letters 555 2 358 362 doi 10 1016 s0014 5793 03 01275 4 angl J C H Hernandez B Duval et J K Hao A genetic embedded approach for gene selection and classification of microarray data In Proceedings of the 5th European conference on Evolutionary computation machine learning and data mining in bioinformatics EvoBIO 07 pages 90 101 Berlin Heidelberg 2007 SpringerVerlag angl E B Huerta B Duval et J K Hao A hybrid GA SVM approach for gene selection and classification of microarray data evoworkshops 2006 LNCS vol 3907 pages 34 44 2006 angl D P Muni N R Pal et J Das Genetic programming for simultaneous feature selection and classifier design IEEE Transactions on Systems Man and Cybernetics Part B Cybernetics Cybernetics vol 36 no 1 pages 106 117 February 2006 angl L Jourdan C Dhaenens et E G Talbi Linkage disequilibrium study with a parallel adaptive GA International Journal of Foundations of Computer Science 2004 angl Zhang Y Dong Z Phillips P Wang S 2015 Detection of subjects and brain regions related to Alzheimer s disease using 3D MRI scans based on eigenbrain and machine learning Frontiers in Computational Neuroscience 9 66 doi 10 3389 fncom 2015 00066 PMC 4451357 PMID 26082713 a href wiki D0 A8 D0 B0 D0 B1 D0 BB D0 BE D0 BD Cite journal title Shablon Cite journal cite journal a Obslugovuvannya CS1 Storinki iz nepoznachenim DOI z bezkoshtovnim dostupom posilannya angl Roffo G Melzi S Cristani M 1 grudnya 2015 Infinite Feature Selection s 4202 4210 doi 10 1109 ICCV 2015 478 ISBN 978 1 4673 8391 2 a href wiki D0 A8 D0 B0 D0 B1 D0 BB D0 BE D0 BD Cite book title Shablon Cite book cite book a Proignorovano journal dovidka angl Roffo Giorgio Melzi Simone September 2016 Features Selection 
via Eigenvector Centrality PDF NFmcp2016 Procitovano 12 listopada 2016 angl R Kohavi and G John Wrappers for feature subset selection en 97 1 2 1997 273 324 angl Das Abhimanyu Kempe David 2011 Submodular meets Spectral Greedy Algorithms for Subset Selection Sparse Approximation and Dictionary Selection arXiv 1102 3975 stat ML angl Liu et al Submodular feature selection for high dimensional acoustic score spaces 17 zhovtnya 2015 u Wayback Machine angl Zheng et al Submodular Attribute Selection for Action Recognition in Video 18 listopada 2015 u Wayback Machine angl Y Sun S Todorovic S Goodison 2010 Local Learning Based Feature Selection for High Dimensional Data Analysis en 32 9 1610 1626 angl LiteraturaHarrell F 2001 Regression Modeling Strategies en angl Liu Huan Motoda Hiroshi 1998 Feature Selection for Knowledge Discovery and Data Mining Springer angl Special Issue on Variable and Feature Selection Journal of Machine Learning Research 2003 angl An Introduction to Variable and Feature Selection oglyad angl Toward integrating feature selection algorithms for classification and clustering oglyad angl Efficient Feature Subset Selection and Subset Size Optimization oglyad 2010 roku angl Searching for Interacting Features Zhao amp Liu predstavlena na IJCAI 2007 angl PosilannyaZmagannya NIPS 2003 roku div takozh en Realizaciya nayivnogo bayesovogo klasifikatora iz obirannyam oznak na Visual Basic vklyuchaye vikonuvanij ta dzherelnij kod Programa obirannya oznak minimalnoyi nadlishkovosti maksimalnoyi dorechnosti minimum redundancy maximum relevance mRMR FEAST vidkriti algoritmi obirannya oznak na C ta MATLAB Automated feature selection with boruta kaggle com angl Procitovano 4 bereznya 2023