Arman Zharmagambetov
Postdoctoral Researcher, Meta AI (FAIR), Menlo Park, CA, USA
Email: armanz [at] meta [dot] com, azharmagambetov [at] ucmerced [dot] edu
Social: [LinkedIn] [Threads]
I am a postdoctoral researcher at Meta AI (FAIR), working with Yuandong Tian at the intersection of machine learning and optimization, most recently on AI-guided optimization, reinforcement learning, and combinatorial optimization. Prior to that, I earned my Ph.D. at the University of California, Merced (UCM), advised by Miguel Á. Carreira-Perpiñán, where I primarily studied learning algorithms for decision trees and tree-based models (see TAO).
Google Scholar and CV
A. Zharmagambetov, B. Amos, A. Ferber, T. Huang, B. Dilkina, and Y. Tian (2023): "Landscape Surrogate: Learning Decision Losses for Mathematical Optimization Under Partial Information".
[arXiv]
[code]
[NeurIPS] A. Zharmagambetov and M. Á. Carreira-Perpiñán (2022): "Semi-Supervised Learning with Decision Trees: Graph Laplacian Tree Alternating Optimization".
Advances in Neural Information Processing Systems (NeurIPS 2022).
[external link]
[paper preprint]
[short video]
[poster]
[EMNLP] A. Zharmagambetov, M. Gabidolla, and M. Á. Carreira-Perpiñán (2021): "Softmax Tree: An Accurate, Fast Classifier When the Number of Classes Is Large".
Conference on Empirical Methods in Natural Language Processing (EMNLP 2021, long paper track).
[external link]
[paper preprint]
[slides]
[poster]
[video]
[ICML] A. Zharmagambetov and M. Á. Carreira-Perpiñán (2020): "Smaller, More Accurate Regression Forests Using Tree Alternating Optimization".
International Conference on Machine Learning (ICML 2020), Jul. 13, 2020.
[external link]
[paper preprint]
[supplementary material]
[slides]
[video]
A. Zharmagambetov (2022): "Learning Tree-Based Models with Manifold Regularization: Alternating Optimization Algorithms".
Ph.D. dissertation, University of California, Merced, USA, 2022.
[external link]
[paper]
[slides]
[CVPR] M. Gabidolla, M. Á. Carreira-Perpiñán, and A. Zharmagambetov (2023): "Towards Better Decision Forests: Forest Alternating Optimization".
IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2023), to appear.
[external link]
[paper preprint]
[supplementary material]
[animations]
[DMKD] S. S. Hada, M. Á. Carreira-Perpiñán, and A. Zharmagambetov (2023): "Sparse oblique decision trees: a tool to understand and manipulate neural net features".
Data Mining and Knowledge Discovery, 2023.
Note: many figures in the publisher's version have incorrect labels; the paper preprint below contains the correct figures.
[external link]
[paper preprint]
[AISTATS] A. Zharmagambetov and M. Á. Carreira-Perpiñán (2022): "Learning Interpretable, Tree-Based Projection Mappings for Nonlinear Embeddings".
International Conference on Artificial Intelligence and Statistics (AISTATS 2022).
[external link]
[paper preprint]
[supplementary material]
[slides]
[poster]
[ICASSP] A. Zharmagambetov, Q. Tang, C.-C. Kao, Q. Zhang, M. Sun, V. Rozgic, J. Droppo, and C. Wang (2022): "Improved Representation Learning for Acoustic Event Classification Using Tree-structured Ontology".
IEEE Int. Conf. on Acoustics, Speech and Signal Processing (ICASSP 2022), May 7, 2022.
[external link]
[paper preprint]
[slides]
[poster]
[ICASSP] A. Zharmagambetov and M. Á. Carreira-Perpiñán (2021): "Learning a Tree of Neural Nets".
IEEE Int. Conf. on Acoustics, Speech and Signal Processing (ICASSP 2021), Jun. 6, 2021.
[external link]
[paper preprint]
[poster]
[slides]
[IJCNN] A. Zharmagambetov, S. S. Hada, M. Gabidolla, and M. Á. Carreira-Perpiñán (2021): "Non-Greedy Algorithms for Decision Tree Optimization: An Experimental Comparison".
International Joint Conference on Neural Networks (IJCNN 2021), Jul. 18, 2021.
[external link]
[paper preprint]
[arXiv version]
[IJCNN] A. Zharmagambetov, M. Gabidolla, and M. Á. Carreira-Perpiñán (2021): "Improved Boosted Regression Forests Through Non-Greedy Tree Optimization".
International Joint Conference on Neural Networks (IJCNN 2021), Jul. 18, 2021.
[external link]
[paper preprint]
[ICIP] A. Zharmagambetov, M. Gabidolla, and M. Á. Carreira-Perpiñán (2021): "Improved Multiclass AdaBoost for Image Classification: The Role of Tree Optimization".
IEEE International Conference on Image Processing (ICIP 2021), Sep. 19, 2021.
[external link]
[paper preprint]
[ICIP] A. Zharmagambetov and M. Á. Carreira-Perpiñán (2021): "A Simple, Effective Way to Improve Neural Net Classification: Ensembling Unit Activations with a Sparse Oblique Decision Tree".
IEEE International Conference on Image Processing (ICIP 2021), Sep. 19, 2021.
[external link]
[paper preprint]
[FODS] M. Á. Carreira-Perpiñán and A. Zharmagambetov (2020): "Ensembles of Bagged TAO Trees Consistently Improve over Random Forests, AdaBoost and Gradient Boosting".
ACM-IMS Foundations of Data Science Conference (FODS 2020), Oct. 19, 2020.
[external link]
[paper preprint]
[video]
[BayLearn] M. Gabidolla, A. Zharmagambetov, and M. Á. Carreira-Perpiñán (2020): "Boosted Sparse Oblique Decision Trees".
Bay Area Machine Learning Symposium (BayLearn 2020), Oct. 15, 2020.
[external link]
[paper preprint]
[BayLearn] M. Á. Carreira-Perpiñán and A. Zharmagambetov (2018): "Fast Model Compression".
Bay Area Machine Learning Symposium (BayLearn 2018), Oct. 11, 2018.
[external link]
[paper preprint]
[poster]
[ICCCI] S. Narynov and A. Zharmagambetov (2016): "On One Approach of Solving Sentiment Analysis Task for Kazakh and Russian Languages Using Deep Learning".
Int. Conf. on Computational Collective Intelligence (ICCCI), Sep 2016.
[external link]
[paper preprint]
A. Zharmagambetov and A. A. Pak (2015): "Sentiment Analysis of a Document using Deep Learning Approach and Decision Trees".
IEEE 12th International Conference on Electronics Computer and Computation, Almaty, Kazakhstan, 2015.
[external link]
[DINWC] A. A. Pak, S. Narynov, A. Zharmagambetov, Sh. Sagyndykova, and Zh. Kenzhebayeva (2015): "The Method of Synonyms Extraction from Unannotated Corpus".
IEEE Int. Conf. on Digital Information, Networking, and Wireless Communications (DINWC), Feb 2015.
[external link]
[paper preprint]