The iron ML notebook
Fibonacci

Brute force

def fib_brute_force(n: int) -> int:
    """
    Time complexity: O(2^N)
        Each call branches into two recursive calls, so the call tree
        roughly doubles at every level until it reaches the base case.

    Space complexity: O(N)
        Recursive functions also take up stack space, one frame per
        active call; the deepest chain of calls here has N frames.
    """

    if n <= 2:
        return 1
    return fib_brute_force(n - 1) + fib_brute_force(n - 2)
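The exponential blow-up is easy to see by counting calls. The helper below (a sketch, not part of the original page; `fib_counted` is a hypothetical name) threads a counter through the same brute-force recursion:

```python
def fib_counted(n: int, counter: list) -> int:
    # Same brute-force recursion as above, but counter[0] records every call.
    counter[0] += 1
    if n <= 2:
        return 1
    return fib_counted(n - 1, counter) + fib_counted(n - 2, counter)

calls = [0]
fib_counted(10, calls)
# calls[0] == 109 for n = 10, and it keeps growing exponentially with n.
```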

Memoization

def fib_dynamic(n: int, index: dict = None) -> int:
    """
    Time complexity: O(N)
        Each value is computed only once: if it is already in the cache we
        return it immediately, so at most N distinct calls do real work.

    Space complexity: O(N)
        Both the cache and the recursion stack grow linearly with N.
    """

    if index is None:
        index = {}

    if n in index:
        return index[n]
    if n <= 2:
        return 1
    index[n] = fib_dynamic(n - 1, index) + fib_dynamic(n - 2, index)
    return index[n]
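The same memoization pattern can also be expressed with the standard library's `functools.lru_cache` decorator, which replaces the hand-rolled dict (a sketch, not from the original page; `fib_cached` is a hypothetical name):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib_cached(n: int) -> int:
    # The decorator caches each return value keyed by n, so every
    # Fibonacci number is computed at most once: O(N) time, O(N) space.
    if n <= 2:
        return 1
    return fib_cached(n - 1) + fib_cached(n - 2)

fib_cached(50)  # returns instantly; the brute-force version would not
```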

Tabulation

def fib_dynamic_tab(n: int) -> int:
    """
    Time complexity: O(N)
        It only iterates over the table once.

    Space complexity: O(N)
        It only requires an array of size N + 1.
    """

    table = [0] * (n + 1)
    if n > 0:  # guard so that n = 0 doesn't index past the end
        table[1] = 1

    for i in range(len(table)):
        if (i + 1) < len(table):
            table[i + 1] += table[i]

        if (i + 2) < len(table):
            table[i + 2] += table[i]

    return table[-1]
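Since each table entry only depends on the two entries before it, the tabulation can be squeezed down to O(1) space by keeping just two variables instead of the whole array (a variant sketch, not covered on the original page):

```python
def fib_constant_space(n: int) -> int:
    """
    Time complexity: O(N)
        One pass, n iterations.

    Space complexity: O(1)
        Only the last two Fibonacci values are kept at any time.
    """
    prev, curr = 0, 1  # fib(0), fib(1)
    for _ in range(n):
        prev, curr = curr, prev + curr
    return prev

fib_constant_space(10)  # 55, same result as the tabulated version
```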