
Why Empowering Women in AI & Data Science Is Important

As the world is increasingly being built around automated systems, where do women fit in?


Businesses are adopting artificial intelligence faster than ever to simplify their processes. AI can automate customer service tasks, help doctors diagnose diseases, improve search engine results, control self-driving cars, and much more. The list goes on and on...

As AI becomes pervasive in everyday life, the question of diversity and inclusion in technology remains a significant concern. In particular, the persistent under-representation of women in data science and AI, including gender data gaps, leads to the encoding and amplification of bias in technical products and algorithmic systems, creating harmful feedback loops.

“To be truly diverse you need to bring people into AI that think differently.”
Kay Firth-Butterfield
Head of AI & Machine Learning and Member of the Executive Committee, World Economic Forum

AI is one of the fields in which women can experience tremendous success, especially with the right push toward female participation in the industry.


Introducing Lians Wanjiku, a Data Science and Machine Learning enthusiast. Here, we'll walk through her journey into data science in the hope of inspiring more young women to join the tech movement.

Lians is a senior-year student and a research assistant intern at the data science center at Dedan Kimathi University of Technology in Kenya.

Noticing how much insight can be extracted from data, Lians became interested in Machine Learning. She joined a data science community about a year ago and has since taken a keen interest in pursuing it as a career. To Lians, it is amazing how data science and AI drive the future!

Detecting Zebras with YOLOv5

YOLOv5 for Detecting Animal Species

Detecting Impala with YOLOv5

Lians got started with YOLOv5 only several months ago! Working with images of various animal species, her main goal was to use YOLOv5 as an object detection model to classify the animals in her school's conservancy. Later in the project, she realized that, after classification, the model's predictions could be used to automatically annotate all the images, reducing human effort and saving the time normally spent annotating by hand.
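The auto-annotation idea can be sketched in a few lines. Below is a hypothetical helper (not Lians' actual code): it converts one predicted pixel-space bounding box into a YOLO-format label line, the plain-text annotation format YOLOv5 trains on.

```python
# Hypothetical helper: turn a detection into a YOLO-format training label.
# YOLO labels are "class x_center y_center width height", with all
# coordinates normalized to [0, 1] by the image dimensions.

def to_yolo_label(cls_id, x1, y1, x2, y2, img_w, img_h):
    xc = (x1 + x2) / 2 / img_w   # normalized box-center x
    yc = (y1 + y2) / 2 / img_h   # normalized box-center y
    w = (x2 - x1) / img_w        # normalized box width
    h = (y2 - y1) / img_h        # normalized box height
    return f"{cls_id} {xc:.6f} {yc:.6f} {w:.6f} {h:.6f}"
```

Writing one such line per detection into a .txt file alongside each image yields a ready-made training annotation, which is exactly the kind of time saving Lians describes.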

Lians also experimented with other pre-trained object detection models, such as TFOD and YOLOv3, because she initially lacked knowledge and skills in PyTorch. However, after finding YOLOv5 through her research, she implemented it quickly. For Lians, the model performs best: it is lightweight, simple to use, and delivers the best accuracy.

“The best part is you can get started with just a few lines of code!”

The Value in YOLOv5

  • Data augmentation
  • Inference speed
  • Multiple model variants (s, m, l, and x), each with a different trade-off between detection accuracy and speed, which made it easy for her to pick the right one

Lians recommends YOLOv5 to anyone who is new to the field. In her words, "YOLOv5 was built for object detection, so it's good at what it does! With fewer operations and less code to write, YOLO is one of the most well-known object detection algorithms thanks to its speed and accuracy."

Lians is open to collaborations on GitHub and available for a chat on Twitter. She also publishes articles on the projects she is working on; check out her article: Introduction to Object Detection with YOLOv5!

I deployed the object detection model on some videos with both zebras and impalas and.... From this perspective I think I'm going to have to go back to the kitchen and work with more data and perfect the model. #100daysofcoding @ultralytics #objectdetection @WomenInDataAfri

— lian.s__ (@lians___) November 29, 2022

Thanks for reading about Lians' experience. At Ultralytics, we look forward to seeing more women join this field. We will continue to make AI easier for everyone, so stay tuned!
