I’m a Ph.D. student under the supervision of Prof. Eran Yahav. My research focuses on source code representations for machine learning models. We study machine learning approaches to code-related tasks such as code completion, edit completion, method name prediction, and automatic documentation generation. My main interest is how to represent code for these tasks. I’m also interested in methods for processing graphs with deep neural networks.

Publications

Accepted Papers

FuseCap: Leveraging Large Language Models to Fuse Visual Data into Enriched Image Captions

On the Expressivity Role of LayerNorm in Transformers’ Attention

How Attentive are Graph Attention Networks?

A Structural Model for Contextual Code Changes

code2seq: Generating Sequences from Structured Representations of Code

Technical Reports

BLOOM: A 176B-Parameter Open-Access Multilingual Language Model

Patents

Loading Deep Learning Network Models for Processing Medical Images

  • Hans Harald Zachmann, Simona Rabinovici-Cohen, Shaked Brody
  • [PDF][BibTeX]

Service

  • Program Committee: Deep Learning for Code Workshop (2022, 2023), MSR 2021 Mining Challenge
  • Reviewer: NeurIPS (2023), ACL (2023)

Awards

  • 2023 – Department Excellence Scholarship
  • 2023 – Excellent Faculty TA
  • 2022 – Department Excellence Scholarship
  • 2019 – Dean’s Excellence Scholarship