In this last episode of the season we continue our discussion of dynamic programming, and show just how efficient it can be by using the Fibonacci sequence!
[00:00:02] SY: Welcome to the Base.cs Podcast where we explore the basics of computer science concepts. I’m your host Saron, Founder of CodeNewbie.
[00:00:09] VJ: And I’m Vaidehi Joshi, Author and Developer.
[00:00:11] SY: And she is the brilliant mind behind the Base.cs Blog Series. Today, we are continuing our conversation on…
[00:00:17] VJ: Dynamic programming.
[00:00:19] SY: This season of the Base.cs Podcast is brought to you by Educative. Educative has a ton of online coding courses, including how to become a machine-learning engineer and a practical guide to Kubernetes. The comprehensive and interactive text-based courses give you in-demand tech skills. And since it’s text-based, there’s no scrubbing back and forth through video to find the content you need. So learn better and faster with Educative and get a 20% site-wide discount by going to educative.io/basecs.
[00:00:56] SY: So we started talking about dynamic programming last episode. It’s also known as dynamic optimization. Can we do a quick recap? What is that?
[00:01:05] VJ: Dynamic programming is basically an approach to algorithm design. And the approach is effectively that we can solve complex problems by breaking them down into smaller versions of those problems, so breaking them down into smaller parts. And with dynamic programming, once you break down a large problem into sub-problems, you can store the results of a sub-problem, and then you basically only need to do the hard work once and you can use those results later on.
[00:01:37] SY: And memoization is a big part of dynamic programming as well, which is awesome. And memoization, I love because it’s like I’m writing a memo.
[00:01:45] VJ: Yeah.
[00:01:45] SY: Right? Like I’m like keeping track of all these things and like writing little notes and that’s really important in dynamic programming as well.
[00:01:51] VJ: Yeah, that’s the storing the results bit of how it works.
[00:01:55] SY: Okay. So we’ve been working with the Fibonacci sequence, which just to remind everyone, is when you add the previous two numbers to get to the third number. So for example, we start with zero, then we add one, zero plus one is one, one plus one is two, one plus two is three, and so on and so forth. So when we’re calculating the Fibonacci sequence, we’re doing dynamic programming, is that right?
[00:02:15] VJ: Well, yeah, we can use dynamic programming if we want to be a little bit smarter about things. But I think before we get into the dynamic programming bit, in order to really drive home the point, it’s worth understanding what it’s like to calculate the Fibonacci sequence to solve this big problem without dynamic programming, which basically means imagine that we don’t know about dynamic programming and we were approaching this problem. And we weren’t using memoization, what would that time complexity be? Because if we look at that version, I think it’ll become really apparent to us why it’s bad.
[00:02:51] SY: Yeah. Yeah. Yeah. Okay, that makes sense. So we know that the basic formula for getting to that third number, let’s call it T(n), is we have to look at the number before it. So T(n-1), and then the number before that, which is T(n-2). So the formula is T(n) equals T(n-1) plus T(n-2)?
[00:03:14] VJ: Yeah. And basically what we’re really getting at where you’re talking about T(n), because we’re talking about time complexity right now, is how much time it will take for us to solve for whatever N is. So that could be like the fifth element in Fibonacci. So how much time T would it take for us to solve for the fifth element N? So really T(n) is the time to solve for an element. And so you’re right that the formula to figure out how much time it’s going to take for us to solve it totally depends on how much time it takes to solve the element before it, n-1, and then the element before that, which is n-2.
[00:04:00] SY: Right.
[00:04:00] VJ: However, it’s also worth noting that we need to solve for the two numbers in the Fibonacci sequence before it, but also we need to sum those numbers and then we need to return the value of that. So there’s like a tiny little bit of computation we need to do, but that’s effectively constant because it doesn’t really matter if you’re solving for the fifth element or the 500th element in Fibonacci because doing addition and then returning the value is like the amount of energy and time it takes to do that is pretty negligible in terms of computing. So we’re just going to say that that’s O of 1.
[00:04:37] SY: Okay.
[00:04:37] VJ: Constant time. So the real amount of time it takes to solve for T(n) is T(n-1) plus T(n-2) plus O of 1, which is that constant time for addition, which is really you can almost forget about it, but I don’t want to forget about it just because it’s important to remember that we have to do it. It’s not going to impact how much time it’s going to take by that much.
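(To make that formula concrete, here’s a minimal sketch in Python of the naive recursive version being described. The episode doesn’t include any code, so the function name is ours.)

```python
def fib(n):
    # Base cases: the 0th Fibonacci number is 0, the 1st is 1.
    if n < 2:
        return n
    # Two recursive calls, the T(n-1) + T(n-2) part,
    # plus one addition and a return, the O(1) part.
    return fib(n - 1) + fib(n - 2)

print(fib(10))  # 55
```

Every call that isn’t a base case spawns two more calls, which is exactly the red flag that comes up next.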
[00:05:00] SY: Okay. So looking at that formula, how do we describe the time complexity?
[00:05:07] VJ: I think what we want to do here is we really want to look at the part of this little formula of solving for some number in the Fibonacci sequence. We want to look at the harder part of that, so it’s not the O of 1 part that’s hard, it’s the T(n-1) plus T(n-2). That’s the harder part because basically what we want to do is we want to recursively solve for the previous element and the elements previous to the previous element, which is the n-1 and the n-2.
[00:05:38] SY: Right.
[00:05:38] VJ: And really what this formula is telling us is that we are doing two recursive versions of Fibonacci within this one problem we’re trying to solve. And the other thing it’s telling us is that when we’re recursively solving for n-1 and n-2, well, what we’re really doing is we have two inner algorithms that we’re trying to do. We have to solve for T(n-1) and then also we have to solve for T(n-2) and so we’re doing two algorithms at the same time recursively, which really should be like a red flag in the air.
[00:06:14] SY: It sounds intense. Yeah.
[00:06:16] VJ: Yeah, it’s not good because basically you’re now doing something that’s exponential. What that ends up being, if you wrote out what it would take to solve for T(n-1) and what it would take to solve for T(n-2), you’d figure out that it would be O of phi raised to the N.
[00:06:39] SY: Phi.
[00:06:41] VJ: And you’ll remember back in Season 7, Episode 2, we talked about phi and its relationship to the golden ratio and how the golden ratio is related to Fibonacci. So the important bit really, if you want to just ignore the whole phi bit, is the fact that we’re solving for something that is raised to the power of N. Anything raised to the power of N is just bad.
[00:07:04] SY: Yeah.
[00:07:05] VJ: Yeah. If something is squared, it’s like, okay, that’s manageable, but something raised to the power of N is especially bad because basically, as this algorithm grows, the runtime multiplies with each addition to the input dataset. So if you now have five elements versus six elements, that’s a huge difference because now your whole run time is nearly doubling. It’s not good. Anytime you see exponents, you should be like, “Oh!”
[00:07:32] SY: Yeah.
[00:07:33] VJ: Not like in life, just with Big O notation.
[00:07:36] SY: Constantly run away from exponents. That is the lesson to learn from this episode. Okay, got it. Okay, so that’s really bad for time complexity. I’m assuming that our dynamic programming, our dynamic optimization, taking advantage of memoization, I’m assuming it gets us something better than exponential.
[00:07:55] VJ: Oh, yeah.
[00:07:55] SY: Okay. So with memoization, what happens?
[00:08:01] VJ: Well, if we think about this, say you had a notepad and you calculated something and wrote it down in that notepad. Let’s ignore for a second the hard work of calculating it. Once you’ve calculated it and you wrote it down, how hard is it to look up the thing you wrote down in your notepad?
[00:08:17] SY: Not that hard. It’s right there.
[00:08:18] VJ: Yeah. It’s actually, if you have like one element or one thousand elements, it takes the same amount of time, right? Because you’re basically like, “I just have to look it up in my little notepad.” So basically the cost of memoization, the cost of looking up a memoized value, once you have it memoized, that takes constant time, that’s known as O of 1.
[00:08:38] SY: Okay.
[00:08:39] VJ: So finding those remembered values is not very hard. It doesn’t cost us a lot, right, because it’s memoized. Now we can’t just magically end up with memoized values, right?
[00:08:49] SY: Right.
[00:08:50] VJ: We have to do some work.
[00:08:51] SY: We’ve got to calculate them.
[00:08:53] VJ: Yes, we’ve got to calculate them. Notably, you only have to calculate them once, which is really nice. Say we’re like, “Okay, let’s find the 10th element in Fibonacci.” You’re going to have to calculate elements nine, eight, seven, all the way down, until you figure out the whole sequence.
[00:09:10] SY: Right.
[00:09:11] VJ: But once you’ve done it, you don’t have to do it again. So if I had to say, “Okay, find the 11th element,” way easier.
[00:09:17] SY: Way easier. Yeah. We’ve done most of the work already considering our memo pad.
[00:09:21] VJ: Yeah, exactly. So really the thing that we care about is not the memoization, because we know that’s a constant amount of time. We care about all the non-memoized calls, so basically all the initial calculations we have to do, the first time that we have to solve for Fibonacci for some number. You’ve got to do the calculation for each element of the Fibonacci sequence once, and how long that takes completely depends on how far through the sequence you’re counting. Looking for the 10th element takes a different amount of time than looking for the hundredth element. So the non-memoized calls to the Fibonacci function completely depend on the size of N, which basically means it’s linear, O of N.
[00:10:08] SY: Oh, that’s much better.
[00:10:10] VJ: Yeah. Effectively, we basically replaced that whole T(n-1) plus T(n-2) with a much nicer formula: O of 1, which is the memoized lookups, plus O of N, which is the non-memoized calls, plus O of 1, which was the original, you know, I have to do some addition and return a value.
[00:10:30] SY: Right.
[00:10:31] VJ: And because O of 1 is negligible, we can basically say that with memoization, recursive Fibonacci just takes linear time, which is O of N.
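(Here’s what that looks like as a sketch in Python: the same recursion as before, plus a dictionary standing in for the memo pad. Again, the code and names are ours, not from the episode.)

```python
def fib_memo(n, memo=None):
    # The "memo pad": previously computed results live in this dict.
    if memo is None:
        memo = {}
    if n in memo:
        return memo[n]  # a memoized lookup: O(1)
    if n < 2:
        return n
    # Each element is computed at most once: O(n) non-memoized calls.
    memo[n] = fib_memo(n - 1, memo) + fib_memo(n - 2, memo)
    return memo[n]

print(fib_memo(100))  # finishes instantly; the naive version never would
```

The first call for a given n does the real work; every later request for that n is just a dictionary lookup.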
[00:10:41] SY: Oh. Nice! What a relief.
[00:10:43] VJ: Yeah. We went from this terrible exponential situation to something linear.
[00:10:50] SY: Nice. Okay. So that is talking about time complexity. What about space complexity?
[00:10:57] VJ: So the space complexity completely depends on how many things you’re memoizing. So if you’re memoizing a few elements, then the amount of space you actually need is much less. But if you’re memoizing more elements, then you need slightly more space. But again, like as I mentioned, memoizing something on a memo pad is actually not very hard; the cost is negligible given the fact that you’re just recording a value once you’ve calculated it.
[00:11:28] SY: Right.
[00:11:28] VJ: So with the space complexity, you do have to sacrifice a little bit of space because you have to memoize parts of the Fibonacci sequence so that you can remember those previous sub-problems. Like if I want to find the 10th element, somewhere in memory, in space, I have to record elements one through nine. So you’re sacrificing some space, but it’s so worth it because you don’t need that much space, and you’re no longer needing to redo those calculations in a silly way. It definitely becomes worth it to sacrifice a little bit of space in order to save that much time.
[00:12:07] SY: That’s good news. I will take that sacrifice.
[00:12:09] VJ: Cool.
[00:12:09] SY: So dynamic programming, so we’ve walked through what it means. We walked through the Fibonacci example and how we implement it in the Fibonacci example. Are we basically done? Is that kind of the memoization being like the crux of what dynamic programming is?
[00:12:29] VJ: Yeah, I think there’s one other thing worth noting, which is that in this whole Fibonacci example that we talked about in this episode and also last episode, we sort of only explored one approach to the dynamic programming paradigm.
[00:12:43] SY: Oh, there’s multiple approaches.
[00:12:45] VJ: Well, at least with dynamic programming, there’s basically two ways that you can approach it. There is the top-down approach and then there’s the bottom-up approach.
[00:12:53] SY: Okay.
[00:12:53] VJ: And we’ve only looked at one with our Fibonacci example. We looked at the top down approach.
[00:12:58] SY: Okay.
[00:12:59] VJ: And with the top-down approach, you start with the large complex problem, then you figure out how to break it down into smaller sub-problems, and then you memoize the solutions to those little parts.
[00:13:09] SY: Okay. So that’s when we were saying like, “Let’s look at the 10th element,” and we’re like, “Uh-oh! To get the 10th element, we need the ninth and eighth. Oh, to get the ninth, we need eighth and seventh.” Like that’s what we mean by top-down?
[00:13:19] VJ: Yes.
[00:13:19] SY: Okay.
[00:13:19] VJ: Yes. Yeah. Yeah. Exactly. The big problem, the large problem, was to find the 10th element, and then we were like, “Okay, break it down into smaller sub-problems.”
[00:13:27] SY: Right.
[00:13:27] VJ: And then we memoized along the way.
[00:13:28] SY: Cool.
[00:13:29] VJ: So if that’s the top-down approach, I bet you could guess what the bottom-up approach would look like.
[00:13:33] SY: Okay. So the bottom-up approach must be starting from the little bitty building blocks and then working our way up to the 10th element. So it might be finding the first element, which is zero, the second element, which is one, the third element, which is one, the fourth element, which is two. I’m not sure, but I think that’s what we’re talking about, right?
[00:13:51] VJ: Yes, you’re totally right. And in fact, when you were explaining Fibonacci at the top of the episode, you talked through how it works: you started with these numbers and you were adding them up. That was basically a bottom-up approach in the sense that you started with the smallest possible sub-problems, you figured out the solutions to those, and then you built it all up to the larger, more complicated problem.
[00:14:14] SY: That seems like the natural way to do it, like starting from the 10th element, it doesn’t even sound right to me, at least not for Fibonacci.
[00:14:20] VJ: Yeah. I think I actually agree with you there. Intuitively, we feel like we just start from the beginning and work our way up. But I think with other kinds of computing, like maybe other types of algorithms or other problems, you don’t even necessarily know what the smallest possible sub-problem is, because you can only sort of wrap your head around the larger complex version. It’s like, “Okay, this is the problem. Let me break it into two parts,” rather than, “Let me figure out what the smallest possible part is.” You might not even know that to start with. Sometimes you have to start with the top-down approach because you don’t know what is at the bottom.
[00:14:59] SY: Yeah. Okay.
[00:15:00] VJ: And if you take the top-down approach, you can figure it out.
[00:15:02] SY: Okay. Neat.
[00:15:03] VJ: So the bottom-up approach that we didn’t really talk through, I can imagine, you know, all the listeners probably can, you know, sort of guess how that would go. There are a couple of benefits to that.
[00:15:14] SY: Okay.
[00:15:14] VJ: First, if you’re going from bottom-up, a really nice thing is you don’t actually need to store all the previous values.
[00:15:22] SY: Oh, cool.
[00:15:23] VJ: Because you’re starting from the bottom, right? So if you’re looking for the 10th element, you don’t really need to remember what the third element was; you’re really only trying to store the last two. So you can store two and three, and then three and four, and then four and five, until you get up to eight and nine, and then you have your answer for the value of the 10th number in Fibonacci. And the nice part about this is that if you don’t need to store all the previous values, that also means you don’t need a super large data structure to store your memoized values. You really only ever care about two. So you can use a smaller data structure to store two values.
[00:15:56] SY: Cool.
[00:15:57] VJ: And then since you only need to memoize the previous two values, now the space complexity problem is also way nicer to solve because instead of trying to memoize possibly N number of pieces of information, you’re just trying to memoize n-1 and n-2 when you get there, and that keeps changing as you work your way from the bottom up.
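(A sketch of that bottom-up version in Python, keeping only the last two values. As before, this is our illustration, not code from the episode.)

```python
def fib_bottom_up(n):
    # Only the previous two values are ever kept: O(1) extra space,
    # and still O(n) time, since we do one addition per step up.
    previous, current = 0, 1
    for _ in range(n):
        previous, current = current, previous + current
    return previous

print(fib_bottom_up(10))  # 55
```

Instead of a memo pad holding the whole sequence, two variables slide up from the bottom until they reach the element you asked for.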
[00:16:21] SY: So in general, if I’m trying to pick between my bottom-up and my top-down paradigm, should I lean towards bottom-up?
[00:16:29] VJ: If you know what the smallest possible sub-problem is, yeah, because you know what the bottom is. But again, if you don’t know what that is and you’re like, “All I can do is break apart my complex problem into two parts,” really probably what you want to do is go for a top-down. You know more about the top than you do about the bottom.
[00:16:47] SY: Yeah.
[00:16:47] VJ: But with Fibonacci, we know how to construct it. We know how to work from two numbers, zero and one, all the way till infinity.
[00:16:55] SY: Cool. And that’s the end of today’s show. If you liked what you heard, please leave us a review and make sure to check out Vaidehi’s blog post. Link is in the show notes. I also want to give a huge shout-out to Educative, the text-based online coding course platform. My producer and I got a chance to give it a try. I took “become a front-end engineer”. And Levi, what did you take?
[00:17:17] LS: I took “Learn Python from scratch”.
[00:17:20] SY: And what do you think?
[00:17:21] LS: I thought it was really great. So I’ve been learning Python a little bit and this sort of like…
[00:17:26] SY: Because I’m making you.
[00:17:27] LS: Because you’re making me. True. You’re not making me learn Python specifically.
[00:17:31] SY: That’s true.
[00:17:32] LS: But it is what I chose. So I was a little bit familiar, but I found that these courses are laid out in a really intuitive way. There are sections that lead seamlessly one into the other. The Python one goes from data types and variables to conditional statements, functions, loops, and then each section has a quiz to make sure that you’re not just blasting through and the information is going one way and not the other.
[00:18:01] SY: I love the quizzes.
[00:18:02] LS: Yeah. It really called me out on my BS.
[00:18:04] SY: Yes.
[00:18:06] LS: I was like, “Yeah, yeah, I get it. I get it. I get it.” And then I took the quiz and they’re like, “You don’t get it.” And I was like, “Aha!”
[00:18:13] SY: You got me.
[00:18:13] LS: You got me. And then throughout all of these different sorts of sections, you can code within the service itself. So you don’t need like an external coding thing. I should know what that’s called. Do you know what that’s called?
[00:18:29] SY: I’m not going to tell you. I’m not going to give you an IDE.
[00:18:30] LS: No. Yes. That’s what it says, an IDE. Did you know I’m a producer for a coding podcast?
[00:18:37] SY: A technical podcast.
[00:18:39] LS: Yeah.
[00:18:39] SY: Two actually, two technical podcasts.
[00:18:41] LS: That’s true.
[00:18:42] SY: So if you are interested in learning how to code, make sure to check out Educative and you can actually get 20% off by going to educative.io/basecs. That’s B-A-S-E-C-S. Check it out. This episode was edited and mixed by Levi Sharpe.
[00:19:00] VJ: Bye everyone.
[00:19:00] SY: Thanks for listening. See you next season.
Thank you for supporting the show!