E02: How 2 predict grades badly…

In this episode, we discuss some of the mistakes that Ofqual made in their algorithm, how using "complicated" maths is not necessarily better, and share some anecdotes of our own experiences with teachers and dealing with (un)conscious bias.

Timestamps
00:20 – Introduction
01:54 – Initial thoughts
02:42 – Mistake #1 – Their approach
04:43 – Mistake #2 – Data leakage
05:15 – Mistake #3 – Emphasis on the rank
06:57 – Mistake #4 – Ignoring outliers
08:31 – Mistake #5 – No peer review
09:16 – Mistake #6 – Too precise
11:14 – Mistake #7 – Disregarding unconscious bias
12:53 – Mistake #8 – The education system in the UK
13:30 – Ofqual considered edge cases – (almost a positive thing!)
15:00 – How we might have handled this situation
17:39 – Another example of algorithmic bias – the accounting system used by the Post Office
18:53 – Challenge: "Prison Break". This is based on the "Liar's paradox", attributed to Epimenides (amongst many other philosophers). For more challenges, presented in a more visual manner, check out our Instagram.
25:52 – Anecdotes of experiencing bias from teachers.
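To make Mistake #3 (emphasis on the rank, 05:15) concrete, here is a minimal sketch of the kind of rank-based allocation being critiqued: grades are assigned purely by matching a teacher-provided rank order against a school's historical grade distribution. This is an illustration only, not Ofqual's actual code; the function name, student names, and distribution are all made up.

```python
def allocate_by_rank(ranked_students, historical_distribution):
    """Assign grades purely by rank order to match a historical distribution.

    ranked_students: list of names, best first (the teacher's rank order).
    historical_distribution: dict mapping grade -> fraction of the cohort
    that historically achieved it (fractions should sum to 1).
    """
    n = len(ranked_students)
    grades = {}
    i = 0
    cumulative = 0.0
    for grade, fraction in historical_distribution.items():
        cumulative += fraction
        # Convert the cumulative share of the cohort into a student count.
        cutoff = round(cumulative * n)
        while i < cutoff and i < n:
            grades[ranked_students[i]] = grade
            i += 1
    # Floating-point rounding can leave stragglers; give them the last grade.
    for j in range(i, n):
        grades[ranked_students[j]] = grade
    return grades

# Illustrative cohort of five, with a school history of 20% As, 40% Bs, 40% Cs.
ranked = ["Asha", "Ben", "Cara", "Dev", "Eli"]
history = {"A": 0.2, "B": 0.4, "C": 0.4}
print(allocate_by_rank(ranked, history))
```

Note what this toy model makes obvious: each student's grade depends only on their rank and the school's past results, so an exceptional student in a historically low-scoring school can never receive a grade their school has not produced before, which is exactly the unfairness discussed in the episode.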

Useful links:
- Ofqual's report
- Bristol University's study on unconscious bias – http://www.bris.ac.uk/media-library/sites/cmpo/migrated/documents/wp221.pdf
- Tom SF Haines' post (Lecturer in Machine Learning at Bath University) – http://thaines.com/post/alevels2020
