Research article
Authors: Elif Kartal, Fatma Çalışkan, Witte Maagd Eskişehirli, Zeki Özen
Neurocomputing, Volume 578, Issue C
Published: 2 July 2024
Abstract
The k-Nearest Neighbor (k-NN) algorithm is a well-known supervised learning algorithm. On the field of rational numbers Q there are two natural absolute values: the usual absolute value and the p-adic absolute value for a prime p. In this respect, the p-adic absolute value motivates us to compute the p-adic distance between two samples for the k-NN algorithm. In this study, the p-adic distance was incorporated into the k-NN algorithm and applied to 10 well-known public data sets whose predictive attributes are categorical, numerical, and mixed (both categorical and numerical). The effect of the number of decimal digits r used in the p-adic distance calculation was investigated for the numerical and mixed data sets. The results obtained for different primes p were very close to one another, especially on the categorical and mixed data sets.
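As a rough illustration of the distance described in the abstract (a minimal sketch, not the authors' implementation): the p-adic distance between two integers a and b is p^(-v_p(a - b)), where v_p(n) is the exponent of the prime p in the factorization of n. Values with r decimal digits can first be scaled by 10^r to integers; the function names and that scaling choice are assumptions made here for illustration.

```python
def p_adic_valuation(n: int, p: int) -> float:
    """Exponent of the prime p in the factorization of n (infinite for n = 0)."""
    if n == 0:
        return float("inf")
    v = 0
    while n % p == 0:
        n //= p
        v += 1
    return v

def p_adic_distance(a: int, b: int, p: int) -> float:
    """p-adic distance |a - b|_p = p ** (-v_p(a - b))."""
    if a == b:
        return 0.0
    return p ** -p_adic_valuation(a - b, p)

def p_adic_distance_decimal(x: float, y: float, p: int, r: int = 3) -> float:
    """Scale values with r decimal digits to integers, then take the p-adic distance."""
    scale = 10 ** r
    return p_adic_distance(round(x * scale), round(y * scale), p)

# Example: 18 - 0 = 2 * 3**2, so the 3-adic distance of 18 and 0 is 3**-2.
print(p_adic_distance(18, 0, 3))
```

A callable like `p_adic_distance_decimal` could, for instance, be passed as the `metric` argument of scikit-learn's `KNeighborsClassifier` (which accepts a user-defined distance function with the brute-force algorithm); whether the authors used that route is not stated in the abstract.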
Information
Published in
Neurocomputing, Volume 578, Issue C
April 2024
254 pages
ISSN: 0925-2312
Publisher: Elsevier Science Publishers B.V., Netherlands
Keywords
- Classification
- Metric
- k-NN
- p-adic distance
- Machine learning
Article type
- Research article