Posts Tagged ‘Teaching’

Testing an O(n log n) running time

Tuesday, May 4th, 2010

This morning I noticed that this blog had been hit by this search term:

I'm not sure why it hit my blog since I have never written about that (as far as I recall), but I know that my current students have to worry about an O(n log n) running time algorithm for their current project.

They have to implement an algorithm for finding tandem repeats, and the essential part of the algorithm is locating "branching" tandem repeats in time O(n log n).  To validate that they get it right, I ask them to run some profiling and describe how they test this.  Perhaps it is my students googling to figure out how?

Well, depending on how you think about this, it is either very simple or very hard to validate an O(n log n) running time.

What does the big-O notation really mean?

When we have an algorithm that runs in time O(f(n)), what we mean is that the worst case running time on input of size n is (eventually) dominated by c * f(n) for some constant c. So if we consider the running time for all input of size n and take the maximum of that, that is the worst case running time.  Let t(n) denote this.  Saying t(n) = O(f(n)) simply means that there exists a constant c (and a cutoff n0) such that t(n) <= c * f(n) for all n >= n0.
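As a made-up example of the definition: a hypothetical running time t(n) = 3 n log n + 7 n is O(n log n), because for n >= 3 we have log n >= 1, so 7 n <= 7 n log n and therefore t(n) <= 10 n log n. The witness constants c = 10 and n0 = 3 are easy to check numerically; here is a small sketch (all names are just for illustration):

```python
import math

def t(n):
    # hypothetical worst-case running time: t(n) = 3 n log n + 7 n
    return 3 * n * math.log(n) + 7 * n

# witness constants for t(n) = O(n log n): c = 10, n0 = 3
c, n0 = 10, 3
assert all(t(n) <= c * n * math.log(n) for n in range(n0, 100000))
```

The assertion only checks a finite range, of course; the actual proof is the one-line argument above it.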

Sometimes we don't want to argue about worst case running time. For example, sometimes we worry more about the expected running time; then t(n) would be the expectation rather than the maximum over all input of size n.  If this is the case, we make it explicit, so whenever we talk about t(n) without any qualifier, we are talking worst case running time.

By this definition, t(n) = O(n log n) doesn't mean that t(n) is some function c * n * log n or anything like that.  The big-O notation only puts an upper bound on t(n); it doesn't say that this upper bound is tight.  We have other notations for that (big-Omega for lower bounds and big-Theta for tight bounds), but you don't see them so much outside algorithmics circles.

Anyway, to simplify matters, I am going to assume for the following that O(n log n) is a tight bound, and by this I mean that t(n) = c * n * log(n) + o(n log n) for some constant c.  Here the little-o notation means that there might be some extra time consumption in t(n), but divided by n log n it tends to zero as n tends to infinity.

Can you prove a certain running time with experiments?

For an algorithm, we mathematically prove that the worst case running time is dominated by some c * f(n).  We don't experiment our way out of it.  But if we wanted to, could we?

Let's assume for a second that we don't have to worry about all input of size n to compute t(n), and that's a big assumption in many cases, since many algorithms have different running time depending on what the actual input is, so the running time isn't really a function of just the size of the input.  If we really did have to worry about this, we would need to run our program on all possible input of size n for each n we want to examine, and the combinatorial complexity of this would soon make it impossible to even attempt.
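To put numbers on that combinatorial complexity: in the tandem repeat project the input is a string over the four DNA letters, so there are 4^n inputs of size n, a number that explodes long before n gets interesting. A quick back-of-the-envelope check (the nanoseconds-since-the-Big-Bang figure is a rough estimate):

```python
# how many different inputs of size n are there?  For strings over
# the four DNA letters (the tandem repeat setting), it is 4^n.
def n_inputs(n, alphabet_size=4):
    return alphabet_size ** n

# already at n = 50 there are more inputs than there have been
# nanoseconds since the Big Bang (roughly 4.4e26)
assert n_inputs(50) > 4.4e26
```

So exhaustively running a program on every input of even a modest size is hopeless; any experiment can only sample.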

So let's assume that the running time only depends on the size of the input. Could we then distinguish between, say, t1(n) = 5 n log(n) and t2(n) = 5 n log(n) + 10^-4 n^2?

If the constant in front of the n^2 term is small, it is pretty hard!

ns  <- seq(1, 1000)
t1s <- 5 * ns * log(ns)                   # t1(n) = 5 n log n
t2s <- 5 * ns * log(ns) + 1e-4 * ns^2     # t2(n) adds the small n^2 term

plot(ns, t1s, type='l', col='blue', xlab='n', ylab='t(n)')
lines(ns, t2s, col='red')

The two are pretty close!  And that is ignoring measurement errors that would muddy the picture further.

Of course, eventually, for some large n, the n^2 term will dominate the n log n term, but for the range you measure it might not.

With your experiments, you are best off just validating that the algorithm, mathematically proved to be O(n log n), was implemented correctly and appears to be running in time c * n * log n (but again, remember that this is assuming that t(n) only depends on n and not the actual input, and that t(n) = c * n * log(n) + o(n log n), i.e. that the bound is tight).

Validating an n log n running time

So how do you validate that t(n) = c * n * log(n) + o(n log n)?

Well, if t(n) = c * n * log(n) + o(n log n), then t(n) / log(n) = c * n + o(n), so just plot t(n) / log(n) and check if it looks like a line!

ns <- seq(1, 1000)
ts <- 5 * ns * log(ns)    # simulated n log n running times

plot(ns, ts / log(ns), type='l', col='blue', xlab='n', ylab='t(n)/log(n)')

Simple, eh?
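If you'd rather not eyeball a plot, the same idea works as a pure numeric check: when t(n)/log(n) is a line through the origin, t(n)/(n log n) settles on the constant c. A minimal sketch with synthetic timings (the constant 5 is just an assumed value; real measurements would be noisy and need a looser tolerance):

```python
import math

# synthetic "measurements" of a running time t(n) = 5 n log n
ns = range(100, 1001, 100)
ts = [5 * n * math.log(n) for n in ns]

# if t(n) = c * n * log(n), then t(n) / (n log n) is the constant c
ratios = [t / (n * math.log(n)) for n, t in zip(ns, ts)]
assert max(ratios) - min(ratios) < 1e-9
```

With real profiling data you would instead check that the ratios stop growing as n increases.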

What about the base of the logarithm?

In the algorithm my students are implementing, the log in O(n log n) isn't the natural logarithm.  It rarely is in computer science.  It isn't even the base-2 logarithm, as it so often is.  The base actually depends on the kind of input (but let's keep ignoring that).

What do you do when you don't know the base of the logarithm and you want to validate an O(n log n) algorithm?  Do you need to divide by the log with the right base to get a line?

Of course not.  Remember log_b(n) = log_a(n) / log_a(b), so if the running time is t(n) = c * n * log_a(n) with a log base a, and you plot it divided by a log base b, then t(n) / log_b(n) = c * log_a(b) * n is still a line.

ns <- seq(1,1000)
ts <- 5 * ns * log10(ns)                # base 10 logarithm

## plotting with base 2 logarithm
plot(ns, ts / log2(ns), type='l', col='blue', xlab='n', ylab=expression(t(n)/log[2](n)))

That is why, when we write the complexity as O(n log n), we don't bother with the base.  All logs are the same under a big-O.
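The change-of-base rule is easy to confirm numerically: dividing a base-10 n log n curve by a base-2 log still gives a straight line, only the slope changes from 5 to 5 * log10(2). A sketch mirroring the R code above:

```python
import math

ns = [10, 100, 1000, 10000]
# running times with a base-10 logarithm: t(n) = 5 n log10(n)
ts = [5 * n * math.log10(n) for n in ns]

# divide by a base-2 logarithm instead: the result is still
# proportional to n, just with slope 5 * log10(2) instead of 5
slopes = [t / (n * math.log2(n)) for n, t in zip(ns, ts)]
expected = 5 * math.log10(2)
assert all(abs(s - expected) < 1e-9 for s in slopes)
```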

In practice, of course, the base of the logarithm matters as it affects the constant in front of the n log n, but for the big-O notation it doesn't, and for validating that your program runs in O(n log n) time it doesn't either.

Writing lecture notes for "Applied Programming"

Monday, November 2nd, 2009

Once again I find myself writing lecture notes for my programming class Applied Programming.

It is a lot of work, and I would really prefer using a textbook instead, but all the textbooks I have seen for Python programming fall into one of two categories: either they assume that the reader is a moron who has to have everything explained in every little detail over close to one thousand pages, or they assume that the reader already knows how to program and just needs a quick introduction to Python.

Neither really fits my purposes.

I have seven weeks to teach basic script programming, so I only want to cover the bits really essential for that, and since I teach mainly biology and medical students I can safely assume that this is the first time my students are exposed to any kind of computer programming.

It is the second time I teach the class, and last year I tried using a text book supplemented with lecture slides, but that didn't really work so well, so this year I am going to write enough lecture notes to cover the material I need.

I had to write my own lecture notes in previous programming classes as well, but those classes were for people familiar with the basics of programming and Python who just needed to know a little extra stuff useful in bioinformatics, so those were completely different notes and I cannot reuse any of them this time.

I plan to spend one day a week on lecture notes this time around.  I don't have time for more.  We will see how that goes, and how much I need to add on top of it next year, where I can take the next iteration over the notes.

306-321=-15

I guess it's a cultural thing

Wednesday, September 9th, 2009

John Hawks gives advice to college freshmen:

Third, never, ever, under any circumstances call a professor by his or her first name. You may think he's cool with that. You're wrong. And if he is cool with that, well, you've just transported yourself into one of those British mysteries where the Oxford professor is always surrounding himself with handsome young undergraduates.

I recognize the description there at the end, but as far as I noticed, everyone was on a first-name basis in Oxford while I was there... but of course, I was there as a postdoc and might not have noticed what happened around undergrads.  It is a very hierarchical place, so it could be true.

Anyway, around these parts it is the direct opposite.  Don't call your professor by his or her last name.  That would be considered odd.  Everyone here is on a first-name basis.

Well, my friends call me by my last name, but that is more of a nickname than formality.  My very best friends call me Dr. Mailund.  Again, that is in jest, not because my friends have any unusual respect for me.

It is more of a joke, really.  While I studied briefly in Adelaide, Australia, I noticed that all the door signs said Dr. This or Prof. That and I found that very funny, so when I came back I started calling people that.  Often to their annoyance, if anything.

It spread a bit as a joke, but never more than a joke.

It might be good to keep in mind in the US or UK, I don't know, but here you are better off calling your teachers by first name.

--

252-268=-16

Thursday, August 27th, 2009

Widmann writes about grade inflation in the English GCSE.  The raw numbers actually make for pretty scary reading.  In pretty much all subjects, the grades have steadily gotten better over the years.  In many cases they have more than doubled.

Unless students are getting smarter or the teaching much better, this should be a cause for concern...

--

239-244=-5

Textbook publishing

Tuesday, August 25th, 2009

I have previously written about textbooks and their tendency to be rather pricey despite not being exactly profitable for the author.

Since writing a textbook is never going to make me rich, if I ever do write one, I think it will be an open source book available online.

Yes, I know that a publisher brings a lot of good to a manuscript, like careful review, typesetting, etc., but I would think that if the book is well received and available for free, I will get similar feedback from readers.

Anyway, one serious drawback of this approach is that downloading a PDF just isn't the same as buying a proper book that you can bring to the beach in the summer and add notes in the margin and such.  Yes, yes, I know I can annotate a PDF, but it is nowhere near as easy as with a printed book.

So I was very excited to see this approach for text books: free books online and cheap printed versions.

That is surely the way to go for textbooks!

--

237-240=-3