Genderless

Our group has produced several models and diagnostic methods for addressing gender bias in natural language processing and computer vision. This demo builds on our ICCV 2019 paper, Balanced Datasets Are Not Enough: Estimating and Mitigating Gender Bias in Deep Image Representations, in which we proposed a method to adversarially remove from an image, as much as possible, any features that could predict whether a person would use a gendered word to describe it. Using a large dataset of captioned images, we selected images whose captions contained references such as "man" or "woman" and trained a model that recognizes the objects in the image while having as much difficulty as possible predicting gender. By applying this transformation in image space, we can examine what the model is trying to do. Try your own images below and see what it does.
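The adversarial idea can be sketched with a gradient reversal layer: a shared encoder is trained to keep features useful for object recognition while reversed gradients from a gender classifier scrub out gender-predictive features. The following is a minimal PyTorch illustration under our own assumptions; the toy encoder, layer sizes, and head names such as `gender_head` are ours, not the paper's exact architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GradientReversal(torch.autograd.Function):
    """Identity on the forward pass; negates gradients on the backward pass,
    so minimizing the gender loss trains the encoder to hurt the classifier."""
    @staticmethod
    def forward(ctx, x):
        return x

    @staticmethod
    def backward(ctx, grad_output):
        return -grad_output

class DebiasedEncoder(nn.Module):
    """A shared encoder with two heads: object recognition (trained normally)
    and gender prediction behind gradient reversal (trained adversarially)."""
    def __init__(self, feat_dim=512, num_objects=80):
        super().__init__()
        # Toy encoder for 3x64x64 inputs; the real system is far larger.
        self.encoder = nn.Sequential(
            nn.Flatten(), nn.Linear(3 * 64 * 64, feat_dim), nn.ReLU()
        )
        self.object_head = nn.Linear(feat_dim, num_objects)
        self.gender_head = nn.Linear(feat_dim, 2)  # "man" vs. "woman" in captions

    def forward(self, images):
        feats = self.encoder(images)
        # Reversed gradients: better gender prediction penalizes the encoder,
        # pushing it toward features that carry no gender information.
        return self.object_head(feats), self.gender_head(GradientReversal.apply(feats))

# One hypothetical training step on random data.
model = DebiasedEncoder()
images = torch.randn(8, 3, 64, 64)
obj_logits, gen_logits = model(images)
loss = F.cross_entropy(obj_logits, torch.randint(0, 80, (8,))) \
     + F.cross_entropy(gen_logits, torch.randint(0, 2, (8,)))
loss.backward()  # encoder gets normal object gradients, reversed gender gradients
```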

The demo below uses the adversarially trained neural network from that paper to remove gender information, as evidenced by textual references to gender in image captions.

Original Image

[uploaded image appears here]

Genderless Image

[model output appears here]
Demo by Lindsey and Vicente
Gallery of examples
Technical Notes: This demo runs on a CPU-only cloud instance, so inference takes between 10 and 15 seconds, not counting preprocessing and upload/download times. The demo relies on the model from our ICCV 2019 paper, Balanced Datasets Are Not Enough: Estimating and Mitigating Gender Bias in Deep Image Representations. Almost all of the processing time is spent in the underlying UNet model, which is used as an autoencoder to generate the output mask. Images uploaded through this demo are never stored on our server or anywhere else (not even temporarily written to disk); we hold them only in volatile memory on the server side while they are being processed and return the resulting image as a base64-encoded string directly to the user's browser. The example images shown here were uploaded by our team, not by users. This demo does not aim to collect any data from users.
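As a rough sketch of the in-memory pipeline described above, an endpoint of this kind might look as follows; the Flask framework, the `/genderless` route, and `run_unet` are our stand-ins for illustration, not the demo's actual code.

```python
import base64
import io

from flask import Flask, jsonify, request
from PIL import Image

app = Flask(__name__)

def run_unet(image: Image.Image) -> Image.Image:
    """Hypothetical stand-in for the UNet autoencoder that generates the
    output mask; this call dominates the 10-15 s CPU inference time."""
    return image  # placeholder: the real model transforms the image

@app.route("/genderless", methods=["POST"])
def genderless():
    # Decode the upload entirely in memory; nothing is written to disk.
    upload = request.files["image"]
    image = Image.open(io.BytesIO(upload.read())).convert("RGB")

    output = run_unet(image)

    # Return the result as a base64-encoded string directly to the browser,
    # so the image is never persisted server-side.
    buf = io.BytesIO()
    output.save(buf, format="PNG")
    return jsonify({"image": base64.b64encode(buf.getvalue()).decode("ascii")})
```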