Bharatanatyam is a traditional Indian classical dance dating back to the second century. As an avid Bharatanatyam dancer for the past 16 years, I've come to understand dance and choreography to such an extent that some might say I've learned a different language. Bharatanatyam combines hand gestures (known as mudras) with facial expressions to form a kind of sign language. Although many dance connoisseurs understand that language, showcasing Bharatanatyam to a global audience that may never have seen the art form comes with its struggles. During performances, it becomes clear that first-time audiences have a hard time following the storyline, which prompted me to look for a solution to this language barrier between artists and their audience.

The proposed project is to build a dataset of the specific hand gestures found in Bharatanatyam, paired with their English interpretations, which could be leveraged to train an AI model that interprets the gestures' meaning, hopefully making Bharatanatyam a more understandable art form. The dataset will be made available on GitHub so others can contribute by submitting a picture and its corresponding definition. To ensure the data is of good quality, we will take high-quality photos of experienced professional dancers, each submitted with an associated description covering details such as the hand pattern, the fingers used, etc.

By stringing pictures into a storyline, we can translate ancient Indian drama and mythology into transmissible stories. And it doesn't have to stop with Bharatanatyam: many other forms of storytelling exist in our world, and this approach could make it easier to connect different cultures and histories.
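To make the dataset idea concrete, here is a minimal sketch of what one contributed entry might look like. The field names, file paths, and the `MudraEntry` class are all hypothetical, invented for illustration; the project itself would need to settle on an actual submission schema.

```python
import json
from dataclasses import dataclass, asdict, field

@dataclass
class MudraEntry:
    # All field names below are hypothetical, for illustration only.
    image_file: str              # path to the high-quality photo
    mudra_name: str              # name of the gesture, e.g. "Pataka"
    english_meaning: str         # the gesture's English interpretation
    fingers_used: list = field(default_factory=list)  # fingers forming the gesture
    description: str = ""        # hand pattern / notes from the contributing dancer

# A sample entry, as a contributor might submit it in a pull request.
entry = MudraEntry(
    image_file="images/pataka_01.jpg",
    mudra_name="Pataka",
    english_meaning="flag; can denote clouds, a forest, or denial",
    fingers_used=["thumb", "index", "middle", "ring", "little"],
    description="All fingers extended and held together, thumb bent slightly inward",
)

# Serializing to JSON keeps entries both human-reviewable on GitHub
# and machine-readable for training an interpretation model later.
print(json.dumps(asdict(entry), indent=2))
```

Storing each entry as a small JSON record next to its image would let contributors submit via ordinary pull requests while keeping the data easy to load for model training.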