The Perpetuation of Real-World Bias
During the hour-and-a-half panel discussion “Hack the Gender Gap: AI for All”, we barely scratched the surface. Still, one point worth mentioning, obvious yet powerful, boils down to this: AI replicates real-world bias. The tricky part, though, is how to break the vicious circle of biased data leading to biased outcomes. Our approach to this problem was to unite experienced leaders and young talents in devising solutions that put AI to ethical uses that deconstruct bias and promote inclusion and equality.

Alleviating Bias Through Diversity
To this end, we organized the HTEC Idea Marathon, a 42-hour hackathon for women students from all over Southeast Europe. For 42 hours, the students worked closely with their HTEC mentors, discussing the real-world negative implications of AI they have witnessed and conceptualizing ideas for using AI to restore the balance. As one of our panelists, Neda Cvijetić, nicely put it, “diverse teams build the best products because they think about customers who look like them, think like them, and act like them.” With initiatives like the HTEC Idea Marathon, we strive to empower young women and inspire them to pursue a career in STEM, add to diversity, and bring new perspectives and energy to the table.

The Winning Solution: Be fAIr
We heard some great ideas, and the jury did not have an easy job deciding on the winner of the hackathon. After carefully evaluating all the solutions, they pronounced team brAIny the winner. The team singled out the problem of online gender-based violence and the lack of awareness of this issue among teenagers. Their solution is a tool that monitors, detects, and flags gender-based online violence on social media using an NLP-based AI model. The main idea is to raise awareness of the issue, educate users, and create a safer, more respectful, and more equitable digital space online.
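To give a flavor of what such an NLP-based flagging step can look like, here is a minimal sketch using the Hugging Face transformers library and the publicly available unitary/toxic-bert toxicity classifier as a stand-in. The model, labels, and threshold shown here are illustrative assumptions; team brAIny’s actual implementation is not described in this post.

```python
# Minimal sketch: classify a social-media post and flag it for review
# if the model's top abuse/toxicity score crosses a threshold.
# unitary/toxic-bert is used only as a publicly available stand-in model;
# it is NOT the model built by team brAIny.
from transformers import pipeline

classifier = pipeline("text-classification", model="unitary/toxic-bert")

FLAG_THRESHOLD = 0.8  # illustrative cut-off, not a tuned production value


def flag_post(text: str) -> bool:
    """Return True if the post should be flagged for human review."""
    result = classifier(text, truncation=True)[0]  # {'label': ..., 'score': ...}
    return result["score"] >= FLAG_THRESHOLD


if __name__ == "__main__":
    posts = [
        "Congrats on winning the hackathon!",
        "Women don't belong in tech, go back to the kitchen.",
    ]
    for post in posts:
        print(f"{'FLAG' if flag_post(post) else 'ok  '} | {post}")
```

In practice, a tool like this would sit behind a moderation queue: flagged posts are surfaced to users or moderators with an explanation, rather than removed automatically, which keeps the educational and awareness-raising goal front and center.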