MIDS Capstone Project Summer 2021

Assistive Twitter Bot for Visually Impaired People

Our team is focused on improving the accessibility of social media for people who are visually impaired. Twitter has an estimated 3 million visually impaired users globally, yet their options for understanding posted images are extremely limited or non-existent: only 0.1% of all images have alt-text describing them. Using existing image and caption datasets, we are training a multi-modal transformer model and building a Twitter bot (@HelperText) as our MVP, which visually impaired users can call on to caption images in tweets. We intend to use the MVP both to contribute to peer and industry learning about image captioning and as a tool for improving the accessibility of the internet.
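Below is a minimal sketch of how the mention-driven flow could work, assuming the Tweepy client library and the Twitter v1.1 REST API. The caption_image() function is an illustrative stand-in for the trained multi-modal transformer model, and the credential placeholders, polling interval, and reply wording are assumptions rather than the project's actual implementation.

import time
import requests
import tweepy

def caption_image(image_bytes):
    # Placeholder: in the real bot this call would run the multi-modal
    # transformer model and return a generated caption string.
    return "A generated caption describing the image."

def run_bot(api, since_id=1):
    while True:
        # Fetch tweets that mention the bot since the last one handled.
        for tweet in tweepy.Cursor(api.mentions_timeline,
                                   since_id=since_id,
                                   tweet_mode="extended").items():
            since_id = max(since_id, tweet.id)
            media = tweet.entities.get("media", [])
            if not media:
                continue  # nothing to caption in this mention
            # Download the first attached image and caption it.
            image_bytes = requests.get(media[0]["media_url_https"]).content
            caption = caption_image(image_bytes)
            # Reply to the caller with the generated description.
            api.update_status(
                status=f"@{tweet.user.screen_name} Image description: {caption}",
                in_reply_to_status_id=tweet.id,
            )
        time.sleep(60)  # poll roughly once a minute

if __name__ == "__main__":
    auth = tweepy.OAuthHandler("API_KEY", "API_SECRET")
    auth.set_access_token("ACCESS_TOKEN", "ACCESS_SECRET")
    run_bot(tweepy.API(auth))

In practice, a deployed bot would also need rate-limit handling, persistence of the last-seen mention ID, and error handling around the model inference step; the loop above only shows the core mention-to-caption-to-reply cycle.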

Last updated: August 5, 2021