Do you ever feel like you never quite understood asymptotic notation and running-time analysis? What exactly is the difference between Θ(n), O(n), and Ω(n)? In computer science, asymptotic analysis is used to compare the relative performance of different algorithms. The symbols used to describe these running times come from mathematics and have precise meanings, but the explanations in lectures and textbooks are often dense. I'm hoping this class gives you an intuitive sense of how it all works and removes the mystery. If you're a software engineer, you'll run into these concepts eventually, whether you're in school or building the next big framework.
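As a quick preview of where we're headed (an informal sketch, not a formal proof): Ο gives an upper bound, Ω a lower bound, and Θ a tight bound from both sides. Saying f(n) = Θ(n²) means f(n) is eventually squeezed between c1·n² and c2·n² for some positive constants. The snippet below checks that numerically for the example function f(n) = 3n² + 5n, with constants chosen purely for illustration:

```python
# Informal numerical check that f(n) = 3n^2 + 5n is Theta(n^2):
# we want constants c1, c2 and a threshold n0 such that
#   c1 * n^2 <= f(n) <= c2 * n^2   for all n >= n0.

def f(n):
    return 3 * n**2 + 5 * n

# Candidate constants (assumed for this illustration):
# lower bound: 3*n^2 <= 3*n^2 + 5*n holds for all n >= 0
# upper bound: 3*n^2 + 5*n <= 4*n^2 holds once 5*n <= n^2, i.e. n >= 5
c1, c2, n0 = 3, 4, 5

# Spot-check the inequality over a finite range of n.
assert all(c1 * n**2 <= f(n) <= c2 * n**2 for n in range(n0, 10_000))
print("f(n) is sandwiched between", c1, "n^2 and", c2, "n^2 for n >=", n0)
```

A finite check like this can't prove an asymptotic claim, of course, but it's a handy way to build intuition for what the constants in the definitions are doing. We'll make all of this precise as the class goes on.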