Wednesday, May 13, 2026

The GPS of AI: Understanding Algorithmic Bias

Hello all,

This week I'm sharing a post from my LinkedIn feed from the amazing Tonya R. Bennett, MLA, ML (She/Her), one of my co-leads in the EDUCAUSE WIT Community Group.

She writes the following:

Let’s break down a term you keep hearing but no one really explains: “Algorithmic bias.”

Think of it like GPS:
You trust it to get you somewhere fast.
But it was trained mostly on highway data, not side streets.
So it keeps routing you the same way… even when a better route exists.

That is algorithmic bias.
The system is not broken. It learned from incomplete data.

Now the jargon, simplified:
🔹 Algorithm = rules the system follows
🔹 Training data = what it learned from
🔹 Output = the decision it gives you

When the data is limited or skewed, the results will be too.
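To make the analogy concrete, here is a minimal sketch in Python of a toy "GPS" whose algorithm is just a majority rule over the trips it has seen. The trip data is hypothetical and chosen to be skewed toward highways, mirroring the point above: the system is not broken, it simply learned from incomplete data.

```python
from collections import Counter

# Training data = what the system learned from.
# Hypothetical and deliberately skewed: 95 highway trips, 5 side-street trips.
training_trips = ["highway"] * 95 + ["side_street"] * 5

def recommend_route(trips):
    """Algorithm = a simple rule: recommend the most common route seen so far.
    Output = the decision it gives you."""
    return Counter(trips).most_common(1)[0][0]

# The output reflects the skew in the training data, not the best route today.
print(recommend_route(training_trips))  # prints "highway"
```

Because the rule only sees what it was trained on, it will keep recommending the highway even on a day when a side street is faster, which is the GPS version of algorithmic bias.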

Why this matters:
AI is shaping decisions in admissions, hiring, and student success.
These systems are not neutral.
They reflect the choices behind them.

👉 You do not need to build AI
👉 You do need to question it

Tonya's message is short, straightforward, and a clear call to action. If we want to continue on our path toward an inclusive, technology-related workplace and society, we need to pay close attention and take action to reduce AI bias.

All the best,
Holly
