I’m an MS Computer Science student at the George Washington University and a visiting student at New York University working with Prof. Duygu Ataman. I co-founded aLLMA Lab to promote NLP research for the Azerbaijani language. My current research interests include multilingual language modeling and adversarial attacks on LLMs. So far, I have worked on the following problems:
- Neural spelling correction for agglutinative languages.
- Building a web-scale text corpus and monolingual LLMs for Azerbaijani.
- Developing context-sensitive retrieval-augmented generation.
- Benchmarking SOTA LLMs’ NLU capabilities on Turkic languages.
- Prompt compression via adversarial attack algorithms.
Before joining NYU, I was a lead machine learning engineer at PRODATA LLC, where I led multiple industry projects on customer chatbots. Before that, I was a machine learning engineer at Azerbaijan AI Lab (within Azercosmos), where I developed spelling correction software for the internal use of certain government agencies. I completed my bachelor’s degree in Biomedical Engineering at Azerbaijan State Oil and Industry University.
News
Feb 2025 | Our paper, TUMLU, is now available on arXiv.
Dec 2024 | I was a reviewer for the MRL Workshop at EMNLP 2024 in Miami, Florida.
Aug 2024 | Joined NYU as a visiting student to work with Prof. Duygu Ataman.
Aug 2024 | Our work on Azerbaijani foundation models received a Best Paper Award (Honorable Mention) at the SIGTURK Workshop at ACL 2024.
Jul 2024 | Our work on contextual document retrieval was accepted to IEEE AICT 2024.
Jun 2024 | Our work on Azerbaijani foundation models was accepted to the SIGTURK Workshop at ACL 2024.